In a somewhat controversial post at his Ground Floor BPM blog, Scott Menter of BPLogix suggested that simulation in the Business Process Management (BPM) world is a non-starter. While his claim that no one in BPM uses simulation is surely a dramatic generalization, he does offer some rationale for his assertion:
Simulation delays automation, and I don’t like to wait. There is an instant benefit from automation, even if the underlying process is not very efficient. To put it another way: I’d rather automate a poorly designed process today than spend six months analyzing, simulating, and optimizing it before automating.
For the simulation and business process design advocates out there, this amounts to paving the cow path, when what you should be doing is using simulation and/or other analyses to study and improve the process before automating it:
Scott’s article offended my sense of logic for exactly this reason, yet something about what he was saying resonated with me and my real-world experience applying simulation to business processes. My first reaction was that Scott was simply wrong. In fact, my first encounter with automation, an imaging and workflow implementation, had been a complete failure, which we attributed to not engineering or improving the process first. However, that was over 15 years ago. Times have changed, and the more diplomatic side of me tended to agree with some of the commentators: there is no right or wrong answer; sometimes simulation makes sense and sometimes, perhaps, not. But then I got to thinking: of course there is a right answer. This is business, after all, and the right answer will be dictated by cost benefit.
But why the strong feelings? Why is it that Scott’s article offends? I think one has to look at the history of BPM and simulation to understand this, and I’d like to consider a specific attribute of each:
- Business Process Reengineering (BPR): Many still associate BPM with BPR. The mantra in BPR is that one does not pave the cow path. Analysis, engineering and improvement of the process comes first followed by automation. And simulation fits perfectly: it is less expensive to experiment with proposed process changes on a simulated system than the real thing!
- Manufacturing: Computer simulation technology has its roots in manufacturing, and its practitioners traditionally come from industrial engineering. While the technology has been adapted to simulate (service) business processes, the roots of simulation technology, and the mindset around its use, often come from a time and a place where the costs of automation were very high (think factories and plants with expensive, highly specialized hardware and proprietary, domain-specific systems).
Within the context of contemporary BPM, both of the above should be called into question. Let’s look at each in turn.
The desire not to pave the cow paths is so appealing because it is intuitive. It’s a key catchphrase of the business process reengineering movement, coined in Hammer and Champy’s Reengineering the Corporation: A Manifesto for Business Revolution. However, “it’s time to stop paving the cow paths” is only part of the quote, and like anything taken out of context, it potentially misleads. The full passage is:
It is time to stop paving the cow paths. Instead of embedding outdated processes in silicon and software, we should obliterate them and start over. We should “reengineer” our businesses.
BPR is about starting over: radical redesign where what replaces an existing process looks nothing like what is currently in place. I would argue that BPM really has little or nothing to do with BPR. Global360 makes the following point:
…while BPR is meant to be disruptive and involves completely re-thinking processes, BPM has its roots in gentler methods. It’s intended for continuous improvement of processes and its evolution was driven by specific technologies. The focus today is on improving productivity of the workers you already have, and making it easier to roll out new business processes or new products while taking advantage of existing IT systems.
If anything, the desire is for business process improvement, incremental changes and optimizations to make the existing process run better. So yes, we are deliberately paving cow paths and there is no mandate to rip out the current process. BPM is not BPR. It’s not necessarily even business process improvement. The business case may be predicated on automation alone. The costs of experimentation in the real world have come way down. How is simulation a requirement for automation?
The number of solutions and the relative ease of deployment (compared to yesteryear) of BPM systems have radically lowered the cost of BPM implementation. In the traditional simulation mindset, where the costs of automation, whether through poor processes or poor implementation, are perceived as very high, the costs of developing and maintaining meaningful simulation models are not considered material in the big picture. However, there are definite costs associated with simulation:
- Analysis paralysis: this is where Scott was apparently coming from in his article. The desire to gold plate a process design can lead to a delay in automation and any associated benefits.
- Talent: creating meaningful simulation models requires specialized training and capabilities.
- Data acquisition: while a process description is required for both automation and simulation, simulation also requires a typically large data set of parameters or inputs such as volumes and their arrival patterns, task durations, resource availability and so forth to ensure the model generates meaningful and accurate results.
- Model maintenance: if you are ever going to use the simulation model again, chances are you will have to actively update the parameters described above.
These costs are not trivial.
So at this point you may be thinking I have gone over to the dark side and joined forces with Mr. Menter. You’d be wrong. While, for example, the cost of data acquisition is much lower for automation than for simulation, simulation has a much higher predictive capability. The kinds of forward-looking, what-if capabilities available in simulation are simply not available from automation alone (that’s not to say automation alone has no predictive capability based on the data it provides; it simply has less).
What this suggests to me is that the traditional use of simulation for process design is misplaced in the BPM context and that the real benefit of simulation within BPM is in the predictive capabilities and prescriptive analysis it provides as a consumer of automation data. When systems are already automated, data acquisition and simulation model maintenance costs come way down. In most cases it makes more sense to employ simulation after automation, not before it. While simulation may still offer benefits as a process design tool (for example, even with automation, if the process simply does not yet exist, the case for simulation as a design tool is strong), the real story is the use of simulation after automation:
In this configuration, the costs of simulation are driven down while the accuracy, and therefore the predictive ability, of simulation models increases.
Why is this so? Automation collects the data required by simulation models. Automating the collection of data, and in some cases the baseline process definition itself through process discovery, reduces the need for specialized talent. Automation also supplies the data required to maintain the models for ongoing use (both simulation parameters and process model extensions). Lastly, data collected from an automated system is more accurate than data compiled by hand for static, steady-state simulation models, which leads to better simulation results.
Simulation, when used in this way, is positioned to provide predictive capabilities for BPM systems that are extremely valuable to management. It’s also consistent with the latest trends in BPM: specifically the process prediction component of process mining and prescriptive analytics answering such questions as: When will my process end? How do I best schedule and assign my staff to handle the anticipated workload?
This article is actually part 4 of my Process Event Streams: What You Need To Know series. I thought I’d be a little more creative with the title of this one. You might want to read part 1, part 2 and part 3 in the series before reading this post.
I’d like to make two propositions:
- Proposition 1: A simulation scenario can be defined entirely by a set of events.
- Proposition 2: The output of a simulation scenario is a set of events.
Taken together: a simulation model is a function of a set of events and results in a set of (virtual) events:
In this article, I’d like to discuss the first proposition. What it says is that all the inputs or parameters required to specify a simulation model can be defined by a set of events. Practically speaking, some of the events will have been abstracted into a format that is more natural for describing certain simulation model parameters; however, these abstractions have their basis in sets of events.
One way to explore this assertion would be to identify the components of a simulation model and see if they can each be related to a set of business process events. The Sim4BPM proposal formalizes what defines a simulation model and so we can explore each component of the Sim4BPM specification as a means of organizing our thoughts:
Process Model
This is specified as a diagram or formal description (e.g. a BPMN diagram) and represents a case where events have been abstracted into a more natural format that is easier to grasp and/or specify: a process model. To demonstrate that a process model is ultimately defined by a set of events, we turn to process mining. One outcome of process mining is process discovery: where no existing description of the business process exists, event logs are used to ascertain the structure, or a description, of the business process. In cases where process discovery is not, or cannot be, used to define a business process, it seems reasonable to assume that our ability to manually model a process is based on a knowledge of the events that occur, or could occur, in that process. When someone manually creates a flowchart of a business process, I believe they are intuitively walking through the events that can occur in that process and abstracting them into a diagram or description.
Resource Parameters
It is not difficult to see that the availability of resources is based on events: for example, the events of coming on shift and going off shift. Proficiency at a certain task or set of tasks is based on the historical productivity of the resource at those tasks, calculated from the set of events corresponding to when work at these tasks was initiated and completed by each specific resource. Even the skill sets of resources can be ascertained by looking at the events corresponding to work of a given type at a given task being reserved by a specific resource.
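As a rough illustration, resource proficiency might be derived from paired start/complete events like this. The tuple layout and event-type names are assumptions made for the sketch:

```python
from collections import defaultdict
from datetime import datetime

# Each event is assumed to be a (case_id, activity, event_type,
# timestamp, resource) tuple; this layout is illustrative only.
def proficiency(events):
    """Average handling time per (resource, activity), in seconds,
    derived from paired start/complete events."""
    starts = {}
    durations = defaultdict(list)
    for case, activity, etype, ts, resource in sorted(events, key=lambda e: e[3]):
        key = (case, activity, resource)
        if etype == "start":
            starts[key] = ts
        elif etype == "complete" and key in starts:
            elapsed = (ts - starts.pop(key)).total_seconds()
            durations[(resource, activity)].append(elapsed)
    return {k: sum(v) / len(v) for k, v in durations.items()}
```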
Activity Parameters
Things like activity durations and the routing of completed work are based on the relationships between events. For example, an activity’s duration can be calculated from the initiation and completion events of specific instances of that activity, and routing can be derived from the relative frequency of post-processing outcomes. Data associated with these events (or with the process instances or activity instances associated with the events) suggests the condition or business rule for doing one thing versus another after a given activity is completed.
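The routing side can be sketched in a similar way: branch probabilities estimated from observed activity-to-activity transitions in the event log. This is a simplification; as noted above, a real model would condition on case data rather than use flat frequencies:

```python
from collections import Counter

def routing_probabilities(transitions):
    """Estimate the probability of each outgoing branch after an
    activity, from observed (from_activity, to_activity) pairs.
    Illustrative sketch only."""
    totals = Counter(src for src, _ in transitions)
    counts = Counter(transitions)
    return {(src, dst): n / totals[src] for (src, dst), n in counts.items()}
```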
Event Parameters
In the Sim4BPM specification, event parameters represent where tokens of work need to be injected into the simulation model. For example, the arrival pattern of work and the initialization of the model with work in progress are both cases where tokens are injected into the model. The relationship to business process events should be obvious and is pretty much one to one. In particular, these parameters are based on events that originate outside the scope of the model but affect processing inside the model.
Border Events vs. Modeling Events
One distinction that is useful to make is between events that originate outside the scope of the model, but whose occurrence directly affects processing inside the model, which I will call simulation border events, and events that are used to model the activities and processing rules inside the simulation model, which I will call simulation modeling events.
Border events consist of events which cause the injection of a new token into the simulation model (the Event Parameters in Sim4BPM) and resource schedules, which in a way cause the same thing – the injection of resource availability into the model.
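As a sketch of border events, arrival events that inject new tokens into a model might be generated like this. The exponential interarrival assumption is purely for illustration; in practice the arrival pattern would be fitted from historical automation data:

```python
import random
from datetime import datetime, timedelta

def arrival_events(start, n, mean_interarrival_s, seed=42):
    """Generate n 'border' arrival events, each of which injects a new
    work token into a simulation model. Interarrival times are drawn
    from an exponential distribution (an illustrative assumption)."""
    rng = random.Random(seed)
    t, out = start, []
    for i in range(n):
        t += timedelta(seconds=rng.expovariate(1.0 / mean_interarrival_s))
        out.append(("case-%d" % i, "arrival", t))  # hypothetical event tuple
    return out
```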
Modeling events can be based on a set of historical events, predicted events, or imagined events. These events are used as the basis for defining the process description, the various activity parameters, and resource proficiencies. In the diagram above I have illustrated past events as being the source of modeling events used in specifying the simulation scenario. This might represent an existing process description and resource capabilities being used as the basis for future simulation scenarios. The modeling events could easily be based on imagined events in the case of a process that is being designed from scratch.
Coming up: I’ll discuss the relationship between the kind of analyses that may be performed with simulation models and the placement of the simulation scenario event space on the time dimension. In doing this, I’ll be discussing the second proposition: that simulation results are a set of events.
I have advocated that the concept and potential value of business process simulation are easy to understand. Yet, if simulation makes so much sense intellectually, why aren’t business managers using it all the time? I’ve outlined the deficiencies of simulation, and the Sim4BPM effort is meant to help address them. In the current state of business process simulation, what managers ideally want is something as simple as clicking a button to automatically simulate their business processes. What they actually get is a rather non-trivial exercise, requiring specific skill sets, to create and maintain meaningful simulation models.
If you step back, what one realizes is that managers don’t actually want a simpler way to simulate their processes. Nor, I suspect, do they want a simpler way to automate their processes. What they really want, quite simply, is answers. How do I process this item at the least cost? How much staff do I need to manage the workload and meet my deadlines? What is the best way to order or schedule the work? Could I operate with less equipment, and if so, how much less? And so on.
Simulation, even if it was dead simple, is only a means to an end. So is BPM in general. Managers don’t actually want either of these. They want the easiest way to get answers to the specific problems and challenges of running their business. Failing that, they want the analytics capability to find those answers. Failing that, they’ll implement the tools to generate the data to get the analytics to get at the answers…
Simulation is but one tool, albeit a useful and powerful one, that generates the kind of data required to perform the analytics required to provide the answers business managers need. It’s kind of low on the food chain. It doesn’t mean it’s not important or useful, because it is both. It’s just that managers don’t want simulation, they want answers. They might need to use simulation (or job scheduling software, or statistical forecasts, or…), but they are a means to an end. Something to keep in mind.
Tools and technologies that provide business process answers, and not just analytics, can often be characterized by optimization methods that identify a best (optimal) configuration. Simulation, in and of itself, is not an optimization tool. Instead, it measures how changes in a business process’ parameters affect the behavior of the process over time. However, simulation can be used as an input to optimization methods. For example, Meta Software aims to answer the question: what is the optimal set of staff schedules given the business process and staffing constraints? This is accomplished by using simulation technology to model and predict workloads, which are input into workforce management technology that optimizes scheduling against those workloads. Robert Shapiro and Hartmann Genrich have developed technology that can optimize process structure, with simulation as an enabler of their solution. In both cases, simulation is very important and useful, but it’s not really about the simulation.
For the purposes of full disclosure, you should know that I currently work for Meta Software. If you or your company is also using simulation as an input to optimization technology to provide answers as opposed to analytics, let me know and I’ll make sure it gets mentioned.
Photo Credit: Caro’s Lines.