The front cover of The Character of Harms has a picture of a knot on it (see Fig. 1). I’m particularly proud of this photograph, because I took it myself. What’s special is what happens when you’re confronted with this image. I know what your brain is doing, because you can’t help yourself. You are naturally drawn into it, and you start by conducting stage one epidemiological analysis of this object, assuming it’s a bad thing to be undone. That means figuring out the structure: the way it is put together, the way it works, its components, structure, and dynamics. Many people move on immediately to stage two epidemiological analysis, which involves figuring out the weaknesses of the thing itself. If I asked you to untie it, you’d need to know its vulnerabilities. Which strand would give way first, or most easily? What’s your plan for unravelling it? You might carry out a little experimentation along the way, but if you have really understood the structure and figured out the vulnerabilities of the risk enterprise itself (that’s the analogy), then your analysis of the structure of the thing leads you to invent a tailor-made solution for undoing it. This is a simple physical analogy for problem solving. I wanted an image that would trigger all those mental processes. This knot is about the right level of complexity to grab your attention.
Now, when we move from the field of individual cognition to organizational behavior, things get more complicated (see Fig. 2). This is the chart that I’ve been using since 1998 just to illustrate the important distinction between problem-centered approaches and program-centered approaches. There are some big and general classes of things out there in the world (bottom-right quadrant) that we should worry about. It might be international trafficking in nuclear materials, or in women and children, or in drugs. Or it could be violent crime or political corruption or environmental pollution.
Once society becomes sufficiently perturbed by the class of risk, we invent a new government agency, or control operation, which needs a strategy. Before long a central idea emerges: a “general theory” of operations. This is how, in general, we will deal with this broad class of problems, and we build big machines. The machines (which sit in the top-right quadrant) are programmatic. They tend to be either functional or process-based. Those are two quite different ideas in organizational theory. We know about the value and efficiencies of functional specialization, and the use of specialist enclaves as incubators for specialist knowledge and skills. Those lessons date back to the industrial revolution. Then, in the last 35 years, we learned the importance of managing processes. These are high-volume, repetitive, and transactional, and they frequently cut across multiple functions. The public sector learned process management from the private sector, with a lag of roughly 5 years. But by now we’ve mastered it. Process management and process engineering methods are commonly applied when we set up or want to improve systems for emergency response, tax-return processing, or handling consumer complaints. Because such public tasks are important, high-volume, and repetitive, it’s worth engineering and automating them, using triage and protocols to ensure accuracy, timeliness, and efficiency.
In the top-right quadrant we therefore have major programs, organized either around functions or processes. But large bureaucracies then have to divide up the work and hand it out, often across multiple regions. So front-line delivery by major regulatory bureaucracies is ultimately performed by functional units and process operations disaggregated to the regional level (top-left quadrant). That’s the quadrant where the rubber hits the road. That’s all good, and that’s the way major agencies have been organized for decades.
This audience knows the various general theories that have come and gone in policing over time. The “professional era” of policing relied on rapid response to calls for service, coupled with detectives investigating reported crime. That was the “general theory” of police operations for a good long time, until it eventually began to break down.
Environmental protection has had its own “general theories” too. If you assume most pollution comes from industrial plants, then the general theory of operations is to issue permits to industrial facilities (to allow them to operate) and attach conditions to those permits governing various discharges, such as smokestacks, water-pipe discharges, and the transportation of hazardous waste. Environmental agencies then monitor compliance with the conditions on the permits. That general theory works fine for many pollution problems, but then environmental issues appear that have nothing to do with local industrial facilities: radon in homes, sick office buildings, airborne deposition of mercury originating from Mexican power plants, importation of exotic species. These problems don’t fit that general model.
At that point regulatory organizations realize that their big programmatic engines cover many things but not all things. Eventually an alternative operational method emerges, one which depends neither on general theories nor on major programs. Examining the general class of risks, an agency realizes “this isn’t one problem, but at least 57 varieties”. Then begins the work of disaggregating risks, focusing on specific problems (bottom-left quadrant), studying their particular structures and dynamics, leading to the possibility of tackling them one by one. Spotting knots, if you like, and then unpicking them. When agencies operate that way, they invariably end up inventing tailored interventions for carefully identified harms.
I drew Fig. 2 in 1998 for inclusion in my book The Regulatory Craft. It was clear at the time that many celebrated innovations in public service involved organizations designing and implementing tailor-made interventions for carefully identified problems. Other emerging innovations involved the disaggregation task itself, which is where the big arrow sits in the middle of the bottom row of the chart. The type of work happening here involves analytic systems, data-mining systems, anomaly-detection systems, sometimes intelligence systems, learning from abroad, imagining problems that you’ve never seen, and the ability to spot emerging problems quickly. I call these, as a general class, “vigilance mechanisms”: the methods you use to discover problems, or potential problems, that you might not have known about if you hadn’t deliberately looked for them.
The point I want to make with this chart is simply that these two methods, program-centric and problem-centric, are quite different. Program-centric work is extremely well established and very formally managed. Problem-centric work in most professions (including the police profession) is relatively new, in many cases quite immature, and often not formally managed at all.
The nature of the work, and the working methods, differ between the top-right quadrant and the bottom-left. Figure 3 illustrates the types of task statements that might appear in the program-centric space. I’ve picked a miscellaneous collection of tasks from different fields. These task definitions are very specific about the preferred programmatic approach: negotiated rulemaking, three-strikes policies, drug-awareness resistance programs, and so on. But they are somewhat vague about the specific problem being addressed, or the range of problems for which that program might be relevant. That’s characteristic of the way work is organized in the top right-hand corner.
Figure 4 illustrates task statements that would fit in the problem-centric space (bottom-left quadrant). In selecting these I have disciplined myself to use roughly the same number of words and to draw on the same domains. These statements are much more precise about the specific problem to be addressed. Each says nothing (yet) about a preferred solution, because stage one epidemiological analysis seeks first to describe the problem accurately and figure out how it works, before considering plausible solutions. Each is a draft, summary problem statement, taken from a different field.