Some strategy folks in an enterprise architecture group recently asked for help making rules more relevant to their organization. Their concerns ranged from when to embed rules in their middle tier versus encapsulating them within services, to identifying ideal use cases and reference implementations. They were specifically interested in coupling rules with BPM and BI.
Such questions come up every time a group or enterprise considers adopting rules technology for more than a specific application. They are looking for guidelines, blueprints, or patterns that will help them disseminate understanding about when and how to use rules. They have adopted a BPM vendor that will be integrated with their selected rule vendor, each as an enterprise standard, so they are particularly interested in the integration requirements between the two.
Two high-level understandings are critical to success in furthering the adoption of rules technology:
- the abstract activities for which rules technology is well-suited, and
- when and why rules technology is better than familiar alternatives
Because they have an established process mentality, I limited my advice on abstract activities to decisions and to governance, risk, and compliance (GRC). Being in IT, they are comfortable with procedural coding, which biases them toward overusing BPM rather than rules. So, I spent some time on rules versus process or procedures.
Hopefully, capturing some of this here will be of use to you.
Decisions and Compliance
A colleague recently suggested that becoming decision-centric will facilitate the broader use of business rules. I agree. Certainly, to the extent that people focus on decisions they will be focusing on areas for which business rules will be relevant. However, other approaches are also relevant to decisions, especially business intelligence (BI), broadly speaking.[1] Combining business rules and business intelligence in a decision-centric manner is the thesis of enterprise decision management.
The EDM perspective is helpful in relating business rules to BI, but it does not cover several important bases. EDM is largely silent on, if not confused by, its relationship to business processes[2]. The silence seems to arise from a focus on service-oriented architecture, even though decisions are made within the context of a business process. Perhaps if the larger BPM market were focused solely on business rules for decision making there would be no confusion. But there are driving forces, such as governance and compliance, that do not fit the EDM mold.
BPM experts know business rules are critical
Every significant BPM vendor claims to support business rules. Some, such as EMC Documentum, IBM/FileNet, and TIBCO, address the issue through partnerships with multiple vendors, using optional “connectors” to ILOG, Fair Isaac, or Corticon. Others, primarily Oracle/Siebel, do so through OEM relationships in which they commit to a single vendor, such as Haley. Still others, such as Pegasystems, provide their own capabilities.
And now SAP is upping the ante with its recent acquisition of Yasu.
Still, the 80/20 of BPM is the process, not how decisions are made.
Since the BPM market substantially overlaps the application development market, BPM remains significantly IT-centric. Consequently, BPM still often follows a largely waterfall development process in which decision making criteria flow out of the business and into IT. Given the prevalence of procedural skills within IT, such criteria are still too frequently transformed into flowcharts or code.
All too frequently, users unwittingly sacrifice the time to market, flexibility, agility, and accessibility advantages of externalizing logic from process.
The bottom line is that BPM is not advancing the broader use of business rules. Rather, BPM vendors are merely responding to those who understand that they have decisions to make within their processes and that rules are more appropriate for managing those decisions than code or flowcharts.
Too few understand where the lines are
In one way, EDM may actually confuse the issue of when to use rules. Unlike BPM, where the confusion involves procedures, in EDM it involves models, such as scorecards.
In the case of BPM, this is a typical question:
“How many or how complex or dynamic does my decision logic need to be to justify using rules rather than expressing the logic in a flow?”
For EDM, typical questions might include:
“How do I decide whether to use rules or a scorecard?”
“How do I decide when to write rules versus add variables to scorecards?”
The most concrete advice for EDM is that if the logic is Boolean (e.g., it arises from a requirement, regulation, or policy with which compliance is mandatory), rules, not scorecards, are appropriate. The next most concrete advice is that if using a rule allows the scorecard to achieve better predictive reliability, the rule is appropriate. This advice is more easily conveyed than realized, however. Interrelating rules and scorecards can affect sample sizes and statistical validity, which is critical.
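To make the first piece of advice concrete, here is a minimal sketch in Java (the policy, weights, and thresholds are all invented for illustration) of that division of labor: a mandatory, Boolean compliance rule is enforced as a rule no matter what the score says, while the scorecard continues to weigh the criteria that are matters of degree.

    // Hypothetical illustration: Boolean, mandatory logic belongs in rules,
    // not in the scorecard, because compliance is not a matter of degree.
    public final class CreditDecisionSketch {

        // Mandatory rule: an applicant under 18 cannot be approved, period.
        static boolean violatesMinimumAgePolicy(int applicantAge) {
            return applicantAge < 18;   // hypothetical policy
        }

        // Non-mandatory criteria remain in a scorecard (weights are invented).
        static double score(double income, int delinquencies) {
            return 0.4 * Math.min(income / 100_000.0, 1.0) - 0.2 * delinquencies;
        }

        static String decide(int age, double income, int delinquencies) {
            if (violatesMinimumAgePolicy(age)) {
                return "DECLINE";       // the rule fires regardless of the score
            }
            return score(income, delinquencies) >= 0.25 ? "APPROVE" : "REFER";
        }
    }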
There is some specific advice on when and how to use rules for scoring below.
There are applications of rules beyond decisions, as mentioned above and discussed further below. But even if our business process includes decisions, how do we know when to externalize those rules, such as in a decision service that will be orchestrated by a business process engine?
Consider procedural bias
The key tradeoff is when to use rules versus procedures. IT has many reasons to be biased in favor of procedures. Although it is difficult, we must strive to overcome that bias with knowledge and objectivity; otherwise we will use one or the other inappropriately.
The most common and powerful bias is familiarity. 100% of “us” can take a snapshot of a business process and reduce it to algorithms + data structures. It might be a lot of work, but we “know” we can do it. We cannot understand how much it will change, in part because we don’t know what we don’t know. And we cannot say how hard it will be to maintain because we cannot comprehend all the detail until it is implemented and working. Nonetheless, we are confident that we can do it. Few of us are as confident with rules.
More often than it should, this bias leads us to flowcharting, which becomes inaccessible as it reflects increasingly deep analysis and refactoring and increasingly complex flows, whether codified or visualized in a BPM tool.
To overcome this bias, why not turn it on its head? Why not ask whether it should be done in flowcharts!? The first thought might be “yes, because I can” or “yes, because I know how”, but such answers do not reflect objective consideration of alternatives to select the best approach.
Frequently, the answer will be obvious, especially if the flowchart is trivial. Quite often, the answer will not be so immediate.
Blueprints for rules
There are obvious heuristics for when rules are probably efficacious. Some of them are task-centric, such as in decision making or compliance. For example:
- if there are many criteria (or reasons) for qualification (or disqualification)
- if there are many exceptions indicating problems (e.g., the unexpected or a lack of compliance)
In the first case, the decisions are among two mutually exclusive alternatives. This is the easiest case for business rules. This case is further improved:
- As the number of criteria increases
- As the set of criteria becomes more dynamic
- As the criteria become more complex
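To make the first case concrete, here is a minimal sketch in Java (the criteria are invented) of qualification between two mutually exclusive alternatives. Each disqualifying criterion stands alone as a rule, so criteria can be added, changed, or retired without touching any control flow.

    import java.util.List;
    import java.util.function.Predicate;

    // Hypothetical applicant data, for illustration only.
    record Applicant(int age, double income, boolean priorBankruptcy) {}

    // A named disqualification criterion: one rule.
    record Disqualifier(String reason, Predicate<Applicant> applies) {}

    public final class QualificationSketch {
        // Each criterion stands alone; adding or retiring one touches no control flow.
        static final List<Disqualifier> DISQUALIFIERS = List.of(
                new Disqualifier("under minimum age", a -> a.age() < 18),
                new Disqualifier("insufficient income", a -> a.income() < 20_000),
                new Disqualifier("prior bankruptcy", Applicant::priorBankruptcy));

        static List<String> reasonsToDisqualify(Applicant a) {
            return DISQUALIFIERS.stream()
                    .filter(d -> d.applies().test(a))
                    .map(Disqualifier::reason)
                    .toList();
        }

        public static void main(String[] args) {
            List<String> reasons = reasonsToDisqualify(new Applicant(17, 50_000, false));
            System.out.println(reasons.isEmpty() ? "qualified" : "disqualified: " + reasons);
        }
    }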
The last point on complexity of the criteria also relates to which non-procedural approaches should be considered. Rules can be implemented in various ways, from simple database lookups or spreadsheet metaphors to formal logic (e.g., relational algebra or predicate calculus) or even using natural language. For the most part, selection should be driven by the underlying information (e.g., whether it involves relations that are more than one-to-one) and the user community (e.g., whether they are productive with formalisms versus natural language).
Rules are great for decisions
Decisions among more than two mutually exclusive alternatives are also straightforward applications of business rules. The usual paradigm is to capture the necessary, sufficient and contradictory conditions for each outcome. Depending on the vendor, this logic can be implemented so that it determines what decisions are indicated or contra-indicated.
Typically, the logic is intended to reach exactly one decision outcome. Applications usually have a default outcome if no decision is indicated. If multiple outcomes are plausible, additional logic may prefer one over the other. Such preferences could be arbitrarily complex or a simple ordering of outcomes.
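Here is a minimal sketch of that paradigm in plain Java (the outcomes and conditions are invented and no vendor's rule language is implied): each candidate outcome carries the conditions that indicate it and those that contra-indicate it, a default applies when nothing is indicated, and a simple ordering expresses preference.

    import java.util.List;
    import java.util.function.Predicate;

    // Hypothetical case data for a loan decision.
    record LoanCase(double income, double debt, boolean fraudAlert) {}

    // One candidate outcome with its indicating and contra-indicating conditions.
    record Outcome(String name, Predicate<LoanCase> indicated, Predicate<LoanCase> contraIndicated) {}

    public final class MultiOutcomeSketch {
        // Listed in order of preference; the conditions are invented for illustration.
        static final List<Outcome> OUTCOMES = List.of(
                new Outcome("APPROVE", c -> c.income() > 3 * c.debt(), LoanCase::fraudAlert),
                new Outcome("DECLINE", c -> c.debt() > c.income() || c.fraudAlert(), c -> false));

        static final String DEFAULT_OUTCOME = "REFER";   // used when nothing is indicated

        static String decide(LoanCase c) {
            return OUTCOMES.stream()
                    .filter(o -> o.indicated().test(c) && !o.contraIndicated().test(c))
                    .map(Outcome::name)
                    .findFirst()              // preference is simply position in the list
                    .orElse(DEFAULT_OUTCOME);
        }
    }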
Blueprints for consistency
For example, if there is a conflict between approving and denying a loan (i.e., both outcomes are indicated), most creditors would refer the case for manual review while others would deny it outright; none would approve it. If such further deliberation results in no single outcome, the automation (i.e., the rules engine) should raise a runtime exception or otherwise respond “no decision”. Such cases indicate either:
- a necessary or contradictory rule is incorrectly expressed
- the knowledge is incomplete (an additional rule is needed)
On the other hand, decisions might not be mutually exclusive, such as which literature to include in a package. This might be equivalent to making decisions about whether to include each piece of literature in the package.
If the logic implies that an alternative is both indicated and contradicted,[3] there are various possible approaches. If an outcome that would be preferred is also indicated without contradiction, the runtime decision may be unaffected. If not, an exception or failure to decide should be indicated, as above. In either case, such contradictions indicate incorrect or incomplete expression of the logic.
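Extending the sketch above, the resolution policy might look like the following. The choices here, such as referring on an approve/deny conflict and logging contradictions for rule maintenance, are assumptions rather than prescriptions.

    import java.util.List;

    // Resolution when the indicated and contra-indicated outcomes conflict.
    public final class ConflictResolutionSketch {

        static String resolve(List<String> indicated, List<String> contraIndicated) {
            // An outcome that is both indicated and contra-indicated signals
            // inconsistent or incomplete knowledge; flag it for rule maintenance.
            indicated.stream()
                    .filter(contraIndicated::contains)
                    .forEach(o -> System.err.println("review rules for outcome: " + o));

            List<String> surviving = indicated.stream()
                    .filter(o -> !contraIndicated.contains(o))
                    .toList();

            if (surviving.size() == 1) return surviving.get(0);
            if (surviving.contains("APPROVE") && surviving.contains("DECLINE")) return "REFER";
            // No single outcome: respond "no decision" (or raise a runtime exception).
            return "NO_DECISION";
        }
    }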
In practice, the number of runtime exceptions approaches zero quickly. As things change (internally or externally), their frequency may escalate, however briefly. Some vendors, especially Corticon, emphasize “compile-time” determination of incompleteness or inconsistency. Unfortunately, such analysis requires that the expressiveness of the rules be limited;[4] such limitations preclude the use of these tools for general-purpose logic.[5] If runtime exceptions must be zero and the logic is not complex, however, the spreadsheet metaphor shown below may suffice (and is easily mastered).
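By the spreadsheet metaphor I mean a decision table: rows of condition values mapped to an action, which a tool can check for overlapping or missing rows. A minimal, hand-rolled rendering (with invented columns and values) looks like this:

    import java.util.List;

    // A minimal rendering of the decision-table ("spreadsheet") metaphor.
    record Row(String ageBand, String incomeBand, String action) {}

    public final class DecisionTableSketch {
        static final List<Row> TABLE = List.of(
                new Row("<18", "any",    "DECLINE"),
                new Row("18+", "low",    "REFER"),
                new Row("18+", "medium", "APPROVE"),
                new Row("18+", "high",   "APPROVE"));

        static String decide(String ageBand, String incomeBand) {
            return TABLE.stream()
                    .filter(r -> r.ageBand().equals(ageBand))
                    .filter(r -> r.incomeBand().equals("any") || r.incomeBand().equals(incomeBand))
                    .map(Row::action)
                    .findFirst()
                    .orElse("NO_DECISION");   // an empty result means the table is incomplete
        }
    }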
Rules are great for recognizing exceptions and compliance
The second case above, recognizing exceptional circumstances, uses the same exception mechanism or “no decision” outcome that your architecture should anticipate concerning incomplete or inconsistent logic (as discussed above).
Exceptional circumstances include rules that generalize (i.e., can do more than) referential integrity as it is supported in databases. For example: “Don’t place orders that cost less than their shipping costs.” Depending on your knowledge capture and automation toolset, you may have to translate this to “an order with a cost of goods less than its estimated shipping costs may not be shipped”.
Exceptions are quite common in compliance applications. Most regulations are expressed as requirements. That is, they tend to say what must be, not what to do. Any such requirement must be transformed either into deductive rules or into operators that take action. Typically, such transformations involve replacing “must”, “may”, “shall”, or “will” and introducing “if” or “unless”. Unfortunately, transforming a requirement often results in more than one rule.
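Using the shipping example above, a minimal sketch of such a transformation (in Java, with hypothetical types, and obviously no substitute for a rule language) splits one “must not” statement into a deductive rule that recognizes the exception and a guard on the action it governs:

    // Hypothetical order data.
    record Order(double costOfGoods, double estimatedShippingCost) {}

    public final class RequirementTransformationSketch {
        // Requirement: "An order whose cost of goods is less than its estimated
        // shipping cost must not be shipped." One statement becomes two rules:

        // Rule 1 (deductive): recognize the exceptional order.
        static boolean isUneconomicToShip(Order o) {
            return o.costOfGoods() < o.estimatedShippingCost();
        }

        // Rule 2 (operational): the governed action is blocked for such orders.
        static void ship(Order o) {
            if (isUneconomicToShip(o)) {
                throw new IllegalStateException("non-compliant: order may not be shipped");
            }
            // ... proceed with shipping (omitted)
        }
    }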
Analytic transformation of requirements into business rules that are more than one-to-one or that affect multiple classes, tasks or processes is a clear indication for using rules technology.
Note that policies, too, are often expressed as requirements, but they are also frequently expressed directly as rules. So, in addition to business requirements and regulations, policies may be enforced using exceptions or may require transformation.
Maintaining exception logic as rules allows exceptions to be recognized and compliance to be enforced throughout a business process without needing to express how (or that) requirements, regulations, or policies are (or need to be) checked (redundantly and distinctly). That is, factoring out requirements (including most regulations and some policies) from a flowchart increases productivity and agility dramatically.
Rules are pretty good for scoring
We discussed decision outcomes such as mutually exclusive or multiple choices above. Rules are also very good for contributing to scores. In lending or insurance, for example, there are hundreds or thousands of indications of risk involving combinations of demographic, claim history, property, health, financial, or other information. In general, these risk factors are expressed as “if A, B, and C then the risk of X is R”. That is, each of them might be viewed as a rule. The outcome of such rules is not a decision or an exception, however. It is a score used by further logic, which may be further scoring rules or thresholds, such as “any risk above today’s risk threshold is unacceptable”, or analytic formulations, such as min/max, averaging, scorecards, or more complex algorithms.[6]
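Here is a minimal sketch of that “if A, B, and C then the risk of X is R” form (the factors, points, and threshold are invented). Each factor is one rule contributing to the score; the combining logic is a simple sum here, but it could be swapped for min/max, averaging, or a scorecard without touching the factors.

    import java.util.List;
    import java.util.function.Predicate;

    // Hypothetical insured party.
    record Insured(int age, int claimsInLastYear, boolean smoker) {}

    // One risk factor: "if <conditions> then the risk contribution is <points>".
    record RiskFactor(String name, Predicate<Insured> applies, double points) {}

    public final class RiskScoringSketch {
        static final List<RiskFactor> FACTORS = List.of(
                new RiskFactor("young with recent claims",
                        i -> i.age() < 25 && i.claimsInLastYear() > 0, 40),
                new RiskFactor("smoker", Insured::smoker, 15),
                new RiskFactor("multiple recent claims", i -> i.claimsInLastYear() >= 3, 25));

        // The combining logic: a simple sum here, but it could be min/max,
        // averaging, a scorecard, or something more complex.
        static double riskScore(Insured i) {
            return FACTORS.stream()
                    .filter(f -> f.applies().test(i))
                    .mapToDouble(RiskFactor::points)
                    .sum();
        }

        // "Any risk above today's risk threshold is unacceptable."
        static boolean acceptable(Insured i, double todaysRiskThreshold) {
            return riskScore(i) <= todaysRiskThreshold;
        }
    }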
The key point here is that as subject matter experts realize heuristics or as correlations with risk, profitability or other key performance indicators are discovered, the use of rules facilitates a scalable scoring architecture.
Don’t consider rules if the process is trivial
Sometimes it is easier to know when not to use a technique. Do not use rules if both of the following hold true:
- the flowchart has no more than a handful of branch points
- the logic is fully understood and will not change[7]
Note that I am not saying that rules should be used if the flowchart has a dozen branch points and will undergo some clarification and change. Rules are clearly indicated, however, as the conditional complexity and the likelihood and amount of change increase.
Although seemingly incredible, the first guideline is realistic. Even the most complex algorithms have very few loops or branches. Of course, one algorithm may invoke sub-routines, but each procedure should have a very simple flowchart. Intuitively, if the flowchart is too complex to visualize mentally[8] then it will be too complex to understand and maintain, whether in code or a BPM tool.
A complex business process may have many interrelated flowcharts, but each of them should be very simple. If a business process flowchart has dozens of branch points, there is a problem! The more tangled it becomes, the harder it is to understand, and by fewer people, which makes such a process more costly and harder to maintain reliably. Also, each flowchart should correspond to a business reality. If the flowchart results from technical analysis and transformation of natural business logic but doesn’t make intuitive sense from a business perspective, expect problems.
Encapsulation works better in services
One common discussion concerns encapsulating business logic (i.e., rules) within objects or services. Services make sense for decisions and compliance checks that are orchestrated by the business process. Encapsulation within classes generally does not work unless it involves a very small number of classes and meets the criteria of simplicity above. Neither works well unless the logic also meets the criterion of locality. If a requirement, regulation, policy, or rule spans many classes in the model and many tasks within the process, externalized rules are strongly indicated. If the span is limited to a very small number of classes or tasks (e.g., two), encapsulation in objects or services may be sustainable, but redundancy and oversights (i.e., errors) will always be much more likely.
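To make the service case concrete, here is a minimal sketch of a decision service boundary (a hypothetical interface, not any vendor's API). The process engine orchestrates the call and acts on the outcome; the rules behind the interface can span as many classes of data as the requirement does without being scattered across tasks.

    import java.util.List;

    // Hypothetical inputs and outcome for a loan decision service.
    record LoanApplication(String applicantId, double amount, double income) {}
    record Decision(String outcome, List<String> reasons) {}

    // The boundary the process engine orchestrates (e.g., as one task node).
    // The rules live behind this interface, not in the process or the classes.
    interface LoanDecisionService {
        Decision decide(LoanApplication application);
    }

    // A trivial stand-in; a real implementation would delegate to a rules engine.
    final class StubLoanDecisionService implements LoanDecisionService {
        @Override
        public Decision decide(LoanApplication app) {
            if (app.amount() > 5 * app.income()) {
                return new Decision("REFER", List.of("amount exceeds five times income"));
            }
            return new Decision("APPROVE", List.of());
        }
    }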
The state of integration between rules and BPM
The reality of today’s market is that business rule capabilities captive to BPM vendors do not have the accessibility, manageability, functionality and performance of those from the dedicated rule vendors mentioned earlier. And the partnerships between top-tier BPM vendors and rule vendors are not integrated deeply enough.
For example, typical business process integrations reuse rules only if users organize them into packages and specify which packages should be considered per invocation (i.e., per node in the process diagram). In addition, modeling information is shared poorly, if at all, between the BPM and BRMS tools.
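To see why this is shallow, consider what a typical integration amounts to in practice. The sketch below uses a hypothetical connector interface, not any vendor's actual API: each node in the process diagram names a rule package and marshals its own inputs, so the package names, fact names, and process variables must all be kept aligned by hand.

    import java.util.Map;

    // A hypothetical facade over a BRMS, as a typical "connector" exposes it:
    // the caller names a rule package and passes whatever facts that package needs.
    interface RuleServiceConnector {
        Map<String, Object> execute(String rulePackage, Map<String, Object> facts);
    }

    // What a single node in the process diagram ends up doing. Nothing here is
    // shared with the process model; the package name and the fact names are
    // conventions the modeler must keep in sync manually.
    final class EligibilityTask {
        private final RuleServiceConnector rules;

        EligibilityTask(RuleServiceConnector rules) { this.rules = rules; }

        String run(Map<String, Object> processVariables) {
            Map<String, Object> result = rules.execute(
                    "underwriting.eligibility",                // package chosen per node
                    Map.of("applicant", processVariables.get("applicant"),
                           "product", processVariables.get("product")));
            return (String) result.get("eligibilityOutcome");  // name is a convention, too
        }
    }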
Oracle’s integration of Haley was deeper than the loose partnerships between other leading vendors, but Siebel is not a BPM platform per se and even that integration has problems. As of version 8.0, the integration of models is weak and the generated natural language is stilted at best. In addition, users need to specify which data to pass for each kind of decision.
The limitations of these integrations certainly affect when to use rules in addition to a BPM platform. The built-in capabilities of some BPM vendors may be the most viable alternative, even if they are weak relative to those of the pure-play rules vendors. Relying on them exacerbates vendor lock-in concerns, however.
I plan to write separately about the requirements for effective integration of BRMS and BPM capabilities. Such integration will become less relevant to Oracle and SAP customers within a few years, since they are each deeply integrating BRMS into Fusion and NetWeaver, respectively.
Expect Oracle and SAP to pressure other BPM vendors to acquire or align more closely with BRMS vendors in order to improve and simplify model and runtime integration.
Expect integrations to focus on the EDM approach. The support necessary for GRC, as described above, will never come from loose integrations.
For now, JBOSS Rules may have the best integration.
[1] Including statistical, neural, or other machine learning techniques and their predictive application.
[2] As James Taylor describes well in Process Management and Decision Management.
[3] In such cases, the logic is called “inconsistent”.
[4] See description logic, e.g., as in OWL-DL.
[5] See here re the decidability of first-order logic and the discussion of OWL-Full here, for example.
[6] Such as the Subjective Bayesian Method commonly used for uncertain reasoning.
[7] Most real-world systems have incomplete knowledge of ill-structured problems. This understanding dates back to Newell and Simon but here is some more recent discussion.
[8] Or to comprehend when viewed graphically