- If you don’t know what’s really going on, then every attempt to fix a problem will only dig the hole deeper. That insight was reinforced when I watched a flight attendant blindly follow a simple rule, without a moment’s thought, during a recent landing at Pearson International Airport.
The rule is "Place your seats in an upright position for landing," and the fellow a few seats in front of me was having a bit of a problem complying with it. His seat wouldn’t latch into the upright position and kept falling back to that oh-so-comfortable 10 degrees. The flight attendant blindly followed the rule and moved him to the seat directly behind the broken one.
Hmm… an interesting solution. He was certainly now in an upright seat, but unfortunately he’d been placed directly in the path of danger. The rule about upright seats exists not to protect the person in the seat, but to protect the person behind the reclined seat from crashing into the headrest now aimed at their head. By not understanding the reason behind the rule, the flight attendant increased the risk to the passenger.
Nor could I do anything about it. Correcting anyone who is blindly following rules, especially someone with bureaucratic power, is just asking for trouble. It’s best to keep your mouth shut and avoid becoming another example of air rage. A basic rule of air travel these days? Do not disagree with, never mind argue with, anyone in a uniform while the plane is in motion.
This is perhaps too simple an example to demonstrate the importance of understanding how things work before we try to fix what we perceive to be broken. Consider the following anecdote.
Many, many moons ago there was a university with the following setup: a lecture hall just across the corridor from a computer room and a tutor room. In the hall there were several vending machines. Undergraduates would take their stacks of punched cards (I told you this was many moons ago), feed them into the hopper, and wait in line until the printout from their run stuttered out of the printer.
They’d then correct their errors, turning to the graduate students in the tutor room if they got really stuck on a problem.
The two graduate students were kept reasonably busy throughout the night, but managed to get some of their own work done during momentary slowdowns in the queue.
Everyone was reasonably happy with this configuration until a new professor began to use the lecture hall. He couldn’t handle the noise from the crowd around the vending machines and complained to the dean. The dean figured the solution was easy enough: get rid of the vending machines.
Two weeks later she was visited by the graduate tutors who were complaining about the hugely increased workload and demanding that two more tutors be hired for every shift.
What’s going on? Why the increased workload? The computer courses weren’t handing out more difficult assignments, and the student headcount hadn’t increased.
The vending machines had created a gathering place, literally a watering hole, which allowed the undergrads to talk about their assignments. Drinking coffee together gave them time to share ideas and solve a certain percentage of their own problems. Remove the coffee machines, and they all went straight from the printer to the tutor. The result? A perceived increase in either the difficulty of the assignments or the student headcount.
Of course, reduced caffeine intake might also have had something to do with the undergraduates’ diminished problem-solving abilities.
There are far too many examples of how not understanding how a system works leads to larger problems when we try to fix or change things. We need only look to the rabbit problem in Australia, or to the World Health Organization’s "Operation Cat Drop" (look it up), for almost hilarious case studies.
Before we attempt to fix any problem, it’s imperative that we understand not only the perceived problem, but also how the system worked before we decided it was broken. To do that we need to take a systems view of the world; otherwise, every time we touch the web of interconnections we’ll awaken a sleeping spider, one far worse than the problem we set out to fix.