You’ve probably experienced it: tomorrow is the long-awaited date of a new software rollout, and the server fails just as you and your team finish. Backup? Of course! From yesterday evening. But most of today’s work is gone. Or think of printers – I get the feeling that mine breaks down precisely when I have no time to repair it or to find a replacement. Phenomena like this are often attributed to Murphy’s law.
Technology and its systemic independence
Those who work with a large, professional data center with mirrored servers, network disks, and heavy-duty printers will say that this doesn’t happen to them. Well, maybe you recognize this situation instead: just as you and your team are finishing up, a software error destroys the entire database, including the mirror. Of course, you have an online backup, but how long has the cause of the error been lurking in the system? Either way, there isn’t enough time between now and tomorrow. “Hmm,” you might counter, “technology is always improving and becoming more fail-safe; this sort of thing will be eliminated over the next few years.” I think so, too. ATMs show us how reliable technology can be. At the same time, technology is increasingly pervading our everyday lives and becoming more and more complex. From a systemic point of view, this means it will continue to develop into an independent system of its own – and systems do not behave deterministically. In other words, the behavior of technical systems will become an important source of events we cannot predict (uncertainty through technology).
“Now,” I hear many of you saying, “that’s nonsense. We just have to know more, then we can control everything,” and “Systemic thinking is all well and good, but it applies to people; applying such systemic principles and assumptions to technology is certainly going too far. They don’t fit the context; they are almost esoteric. We just have to make sure we know enough to properly plan and control things, and then we can keep technology under control.” But can we really? Under the leadership of Fritz Böhle, the Munich Institute for Social Research has been researching unpredictability and how to act in such situations, and it has empirically reached the conclusion that technology is a significant source of unexpected events, on a par with the human factor.
Risks and side effects of digitization
In times of digital change, this is not easy to accept. Like many others, and as a computer scientist, I have been hoping for years that technology would make our lives better and easier. It does this in many ways, and at the same time we have to acknowledge that it also triggers unrest and uncertainty. This is not unique to digitization; it applies to other innovations as well. For a long time there is clear improvement, and then we start to notice unexpected side effects. We discovered antibiotics and used them to heal many people, and then suddenly resistant strains emerged and people began dying again. We discovered nuclear power and used it for energy, but the waste pollutes the earth badly despite our best attempts at containment. It is always the same story, and the same rules apply to digitization. Technology penetrates the whole of society, yet we are still perplexed by the unexpected side effects. No wonder many people are opposed to digitization.
The obvious escape is to give more consideration to risks and to develop countermeasures. Good (project) management has always done this; after all, project management is largely about introducing innovations. If we also use agile techniques, we can recognize both potential mistakes and new opportunities early on, and thus avoid acting negligently. But the “unknown unknowns” remain, and they cannot be planned or controlled, so we will never be able to control technology entirely; it will develop a life of its own. This feeling has been creeping up on us for a while. In Goethe’s poem “The Sorcerer’s Apprentice,” the narrator laments: “Spirits that I’ve summoned / My commands ignore.” And yet we hope that we will be able to control these spirits if we just do “more of the same”: gain more knowledge, plan better, control more strictly. We don’t dare to say bluntly that there will always be a limit to our intellect, one we will never overcome with rationality alone. (In the GPM expert report “Dealing with Uncertainty in Projects,” this rational process is called “objectifying action.” It is nothing more than activities like gaining knowledge, planning, and controlling.)
Gaining the ability to act
There are probably many reasons for holding back. One key point is that Western society has placed great importance on rationality for hundreds of years and devalues things that cannot be explained rationally. Rational understanding gives us a feeling of security. Guest author Ralf Schmitt recently described our instinctive need for security, and the fear linked to rapid change, in his blog post “Don’t worry, bunny.” Understanding controls fear. But as Ralf Schmitt wrote in his post, “security is an illusion.” I would expand on this: the predictability and controllability of technology is an illusion. This is where VUCA comes in.
I agree with much of what Ralf Schmitt says in his blog. And I think it is natural to try to overcome fears and uncertainty with rationality in times characterized by apparent rationality due to increasing digitization. Rationality creates inner acceptance: we recognize that we are deceiving ourselves into believing in a false security, and that is the prerequisite for reflecting on what we are doing and therefore – perhaps – making a change. But when it comes to acting in uncertainty, rationality is only one aspect; keeping the ability to act when unexpected events occur requires more than just “more of the same rational approach.” Rationality is not enough. It is like New Year’s resolutions: it is all well and good to decide on them rationally (in the sense of acceptance), but that is not enough for sustainable implementation.
To remain capable of acting in times of VUCA, fears must not turn into panic, nor stability into rigidity. But our brains are well trained to do exactly that when we are overwhelmed: we think and think, our perspective narrows, and suppressed fears grow louder. The rabbit inside us makes us freeze, or we lash out blindly – just think of the jokes about computers being thrown out of the window when they don’t do what we expect them to do.
Skills for a digital transformation
Are you asking yourself what we need to “survive” the digital transformation – beyond a fundamental understanding of technology and cognitive reflection on our reactions to uncertainty? And how we should cope with the pace of change? We need something that will still be accessible even when digitization rolls out across our workplaces and private lives – with all its consequences.
We need an attitude that connects new forms of security with creative enjoyment, because playful joy and curiosity about new things are only possible if we do not feel fundamentally threatened. If outside circumstances become more uncertain because change seems to be accelerating, then it is not enough to reflect cognitively that we are not really in danger at this moment and that we live in the best times and societies there have ever been. What matters is security that comes from a kind of self-awareness in the truest sense of the word, not security based on outside circumstances. This security is the basis for an adequate approach to new things. It is not really found in our brains; it is found in the body (our language offers many examples, e.g. “we stand something” in the sense of tolerating, or “we move forward” in the sense of progressing). And maybe Ralf Schmitt’s statement could be extended like this: external security (certainty) is an illusion; to cope with that, inner security is required (inner stability in flexibility).
To learn more about ways of keeping the ability to act beyond rational planning and control, have a look at my blog post “Once upon a time, there was a project plan” or visit a workshop on May 26 or June 25 in Berlin.
Expert report “Umgang mit Ungewissheit in Projekten” (“Dealing with Uncertainty in Projects”), GPM, November 2016, published online in German only at: https://www.gpm-ipma.de/know_how/studienergebnisse/umgang_mit_ungewissheit_in_projekten.html
Johann Wolfgang von Goethe: The Sorcerer’s Apprentice