Event-centricity driving TIBCO

The call transcript from TIBCO’s Dec 21 review of Q4 results is great reading.  Starting from a simple Rete algorithm and the insightful acquisition of Spotfire, TIBCO has transformed itself from a technical middleware vendor into a promising enterprise platform.

TIBCO has a long way to go in making its business optimization offerings less technical, but for those who can tolerate less alignment between IT and the business than may be ideal, TIBCO is leading the way in integrating technical agility with business visibility.

It will be tough for Oracle, IBM, or SAP to close the gap with what TIBCO has.  Don’t be surprised if rule-based, event-driven business processing drives the acquisition of TIBCO by one of them over the next two years.  The growth rate certainly justifies it!  And it won’t stop.

Tendencies and purpose matter

The Basic Formal Ontology (BFO) offers a simple, elegant process model.  It adds alethic and teleological semantics to more procedural models, among which I would include NIST’s Process Specification Language (PSL) along with BPMN.

Although alethic typically refers to necessary vs. possible, it clearly subsumes the probable or expected (albeit excluding deontics[0]).  For example, consider the notion of ‘disposition’ (shown below as rendered in Protégé):

BFO's concept of 'disposition'

For example, cells might be disposed to undergo the cell cycle, which consists of interphase, mitosis, and cytokinesis.  Iron is disposed to rust.  Certain customers might be disposed to comment, complain, or inquire.

Disposition is nice because it reflects things that have an unexpectedly high probability of occurring[1] but that may not be a necessary part of a process.  It seems, however, that disposition is lacking from most business process models, even though it is prevalent in both the soft and hard sciences and important in medicine.

Disposition is distinct from what should occur or be attempted next in a process.  Just because something is disposed to happen does not mean that it should or will.  Although disposition is clearly related to business events and processes, it seems surprisingly lacking from business models (and CEP/BPM tooling).[2]
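
To make the idea concrete, here is a minimal sketch in Python of how a disposition might be attached to a business entity as a tendency toward a kind of process, distinct from any required step of a workflow.  The names (Disposition, Customer) and the likelihood numbers are invented for illustration; this is not BFO’s formalization or any vendor’s model.

    from dataclasses import dataclass, field

    @dataclass
    class Disposition:
        """A bearer's tendency toward realizing some kind of process.

        An illustrative approximation of the BFO notion: the bearer *may*
        realize the process, with some expectation, but the process is not
        thereby a required step of any workflow."""
        bearer: str          # e.g., "iron bar #7" or a customer id
        realized_in: str     # kind of process, e.g., "rusting" or "complaining"
        likelihood: float    # expectation that it will be realized (0..1)

    @dataclass
    class Customer:
        name: str
        dispositions: list = field(default_factory=list)

    # Certain customers might be disposed to complain or inquire; nothing in
    # an order-handling process *requires* either event to occur.
    alice = Customer("Alice", [
        Disposition("Alice", "complain", likelihood=0.30),
        Disposition("Alice", "inquire", likelihood=0.55),
    ])

    # A disposition-aware CEP/BPM engine could anticipate the likelier events:
    expected = [d.realized_in for d in alice.dispositions if d.likelihood > 0.5]
    print(expected)   # ['inquire']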

A teleological aspect of BFO is the notion of purpose or intended ‘function’, as shown below:

Function according to the Basic Formal Ontology

Function is about what something is expected to do or what it is for.  For example, what is the function of an actuary?  Representing such functionality of individuals or departments within enterprises may be atypical today, but it is clearly relevant to skills-based routing, human resource optimization, and business modeling in general.
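
As a hedged illustration of why declared function matters for something like skills-based routing, the sketch below routes work to whichever role’s function covers it.  The roles, functions, and work types are made up; no product’s API is implied.

    # Routing work by the declared function of each role (all names invented).
    functions_by_role = {
        "actuary":         {"price risk", "estimate reserves"},
        "underwriter":     {"assess application", "price risk"},
        "claims adjuster": {"assess claim", "negotiate settlement"},
    }

    def route(work_item: str) -> list[str]:
        """Return the roles whose declared function covers this kind of work."""
        return [role for role, funcs in functions_by_role.items() if work_item in funcs]

    print(route("price risk"))     # ['actuary', 'underwriter']
    print(route("assess claim"))   # ['claims adjuster']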

Understanding disposition and function is clearly relevant to business modeling (including organizational structure), planning, and performance optimization.  Without an understanding of disposition, anticipation and foresight will be lacking.  Without an understanding of function, measurement, reporting, and performance improvement will be lacking.


[0] SBVR does a nice job with alethic and deontic augmentation of first-order logic (i.e., positive and negative necessity, possibility, permission, and preference).

[1] Thanks to BG for “politicians are disposed to corruption,” which indicates a population that is more likely than the population at large to be involved in certain situations.

[2] Cyc’s notion of ‘disposition’ or ‘tendency’ is focused on properties rather than probabilities, as in the following citation from OpenCyc.  Such a notion is similarly lacking from most business models, probably because its utility requires more significant reasoning and business intelligence than is common within enterprises.

The collection of all the different quantities of dispositional properties; e.g. a particular degree of thermal conductivity. The various specializations of this collection are the collections of all the degrees of a particular dispositional property. For example, ThermalConductivity is a specialization of this collection and its instances are usually denoted with the generic value functions as in (HighAmountFn ThermalConductivity).

Time for the next generation of knowledge automation

In preparing for my workshop at the Business Rules Forum in Las Vegas on November 5th, I have focused on the following needs in reasoning about processes, about events, and about or over time:

  1. Reasoning at a point within a [business] process
  2. Reasoning about events that occur over time
  3. Reasoning about a [business] process (as in deciding what comes next)
  4. Reasoning about and across different states (as in planning)

Enterprise decision management (EDM) addresses the first.  Complex event processing (CEP) is concerned with the second.  In theory, EDM could address the third, but it does not in practice.  This third item includes the issue of governing and defining workflow or event-driven business processes rather than point decisions within such processes.
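
To make the first two distinctions concrete, here is a small illustrative sketch: a point decision (EDM-style) evaluates facts in hand at a single step, while an event pattern (CEP-style) is detected across a window of time.  The rules, thresholds, and function names are invented, not any product’s syntax.

    from datetime import datetime, timedelta

    # (1) A point decision within a process (EDM-style): every fact needed
    #     is in hand at the moment the decision is requested.
    def approve_credit(score: int, debt_ratio: float) -> bool:
        return score >= 680 and debt_ratio < 0.4        # illustrative thresholds

    # (2) Reasoning about events over time (CEP-style): the conclusion depends
    #     on a pattern across a time window, not on any single fact.
    def suspicious_activity(withdrawals: list[tuple[datetime, float]],
                            window: timedelta = timedelta(hours=1),
                            limit: float = 5000.0) -> bool:
        withdrawals = sorted(withdrawals)
        for i, (start, _) in enumerate(withdrawals):
            in_window = sum(amount for t, amount in withdrawals[i:] if t - start <= window)
            if in_window > limit:
                return True
        return False

    now = datetime(2008, 10, 1, 9, 0)
    print(approve_credit(705, 0.35))                                      # True
    print(suspicious_activity([(now, 3000.0),
                               (now + timedelta(minutes=20), 2500.0)]))   # True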

Business applications of rules have not advanced to include the fourth item.  That is to say, business has yet to significantly leverage reasoning or problem-solving techniques that are common in artificial intelligence.  For example, artificially intelligent question-and-answer systems, which are being developed for the semantic web, can do more than retrieve data – they perform inference.  Commercial database and business intelligence queries are typically much less intelligent, which presents a number of opportunities that I don’t want to go into here but would be happy to discuss with interested parties.  The point here is that business does not use reasoning much at all, let alone to search across the potential ramifications of alternative decisions or courses of action before making or taking one.  Think of playing chess, or of a soccer-playing robot planning how to advance the ball on goal.  Why shouldn’t business strategies or tactical business decisions benefit from a little simulated look-ahead along with a lot of inference and evaluation?
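
To suggest what such simulated look-ahead might look like in a business setting, the following sketch searches a couple of decisions ahead over alternative actions and scores the resulting states.  The state variables, actions, and payoffs are entirely invented; real planning would involve far richer models and inference.

    # Illustrative look-ahead over alternative decisions (invented state and actions).
    ACTIONS = {
        "discount":     lambda s: {**s, "demand": s["demand"] * 1.2, "margin": s["margin"] - 2},
        "hold_price":   lambda s: {**s, "demand": s["demand"] * 0.95},
        "add_capacity": lambda s: {**s, "capacity": s["capacity"] + 10, "margin": s["margin"] - 1},
    }

    def evaluate(state) -> float:
        """Score a state: profit on the demand we can actually serve."""
        return min(state["demand"], state["capacity"]) * state["margin"]

    def best_plan(state, depth=2):
        """Search `depth` decisions ahead; return (value, sequence of actions)."""
        if depth == 0:
            return evaluate(state), []
        best_value, best_actions = float("-inf"), []
        for name, effect in ACTIONS.items():
            value, actions = best_plan(effect(state), depth - 1)
            if value > best_value:
                best_value, best_actions = value, [name] + actions
        return best_value, best_actions

    start = {"demand": 100, "capacity": 90, "margin": 10}
    print(best_plan(start, depth=2))   # best two-step plan and its projected value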

Even though I have recently become more interested in the fourth of these areas, I expect the audience at the Business Rules Forum to be most interested in the first two points above.  There will also be some with enough experience of the complex business processes common in larger enterprises to be interested in the third item.  Only the most advanced applications, such as biochemical process planning, call for the fourth, and I don’t expect many of their practitioners to attend!

The notion of enterprise decision management (EDM) is focused on point decision making within a business process.  For enterprises that are concerned with governing business processes, a model of the process itself must be available to the business rules that govern its operation.  I’ve written elsewhere about the need for an ontology of events and processes in order to effectively integrate business process management (BPM) with business rules.  Here, and in the workshop, I intend to get a little more specific about the requirements, what is lacking in current standards and offerings, and what we’re trying to do about it.

Super Crunchers: predictive analytics is not enough

Ian Ayres, the author of Super Crunchers, gave a keynote at Fair Isaac’s Interact conference in San Francisco this morning.  He made a number of interesting points related to his thesis that intuitive decision making is doomed.  I found his points on randomized trials much more interesting, however.

In one of his examples on “The End of Intuition,” a computer program using six variables did a better job of predicting Supreme Court decisions than a team of experts.  He focused on the fact that the program “discovered” that one justice would most likely vote against an appeal if it was labeled a liberal decision.  By “discovered” I mean that the decision tree for this justice’s vote had a top-level test of whether the decision was liberal, in which case the program had no further concern for any other information.
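
As a hedged sketch of the kind of decision tree being described, consider the hand-coded stand-in below.  Only the top-level test reflects the example above; the remaining variables and branches are invented filler, not the model from the study.

    # The top-level test settles the prediction, so no other variable is consulted.
    def predict_vote(decision_is_liberal: bool, circuit: str, case_type: str) -> str:
        if decision_is_liberal:
            return "against the appeal"          # top-level split decides outright
        if case_type == "economic" and circuit in {"2nd", "9th"}:
            return "against the appeal"          # invented lower branches
        return "for the appeal"

    print(predict_vote(True, "5th", "civil rights"))   # against the appeal
    print(predict_vote(False, "9th", "economic"))      # against the appeal
    print(predict_vote(False, "5th", "criminal"))      # for the appeal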

Adaptive Decision Management

HR dashboard

In this article I hope you will learn about the future of predictive analytics in decision management and how tighter integration between rules and learning is being developed that will adaptively improve diagnostic capabilities, especially in maximizing profitability and detecting adversarial conduct such as fraud, money laundering, and terrorism.

Business Intelligence

Visualizing business performance is obviously important, but improving business performance is even more important.  A good view of operations, such as this nice dashboard[1], helps management see the forest (and, with good drill-down, some interesting trees). 

With good visualization, management can gain insights into how to improve business processes, but if the view does not include a focus on outcomes, improvement in operational decision making will be relatively slow in coming.

Whether or not you use business intelligence software to produce your reports or present your dashboards, however, you can improve your operational decision management by applying statistics and other predictive analytic techniques to discover hidden correlations between what you know before a decision and what you learn afterwards, improving your decision making over time.
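
As a minimal sketch of that feedback loop, the code below fits a simple logistic model to past decisions and their observed outcomes and then uses it to inform the next decision.  The features, outcomes, and choice of logistic regression fit by stochastic gradient descent are illustrative assumptions, not a description of any particular product.

    import random
    from math import exp

    def sigmoid(z: float) -> float:
        return 1 / (1 + exp(-z))

    # (features known before each decision, outcome learned afterwards);
    # the numbers are invented to keep the example self-contained.
    history = [((1.0, 0.2), 1), ((0.1, 0.9), 0), ((0.8, 0.4), 1), ((0.2, 0.7), 0)]

    # Fit a simple logistic model by stochastic gradient descent.
    random.seed(0)
    weights, rate = [0.0, 0.0], 0.5
    for _ in range(1000):
        x, y = random.choice(history)
        p = sigmoid(sum(w * xi for w, xi in zip(weights, x)))
        weights = [w + rate * (y - p) * xi for w, xi in zip(weights, x)]

    def decide(features, threshold=0.5) -> bool:
        """Approve when the learned odds of a good outcome are high enough."""
        return sigmoid(sum(w * xi for w, xi in zip(weights, features))) >= threshold

    print(decide((0.9, 0.3)))   # approve: resembles decisions that turned out well
    print(decide((0.1, 0.8)))   # decline: resembles decisions that turned out badly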