Blaze down in Fair Isaac’s Q1 2012

FICO reported 9% growth in revenues year over year.

  • the bulk of revenues and all of the growth were in pre-configured Decision Management applications
  • FICO score revenues were half as much, with B2B growing as B2C (myFICO) waned
  • tools revenues were less than half as much again, and flat
    • optimization (Xpress) was up
    • Blaze Advisor was down

This is in sharp contrast to the success that Ilog has enjoyed under the IBM umbrella.

Blaze Advisor doesn’t seem to make sense as a stand-alone tool any more.  Applications are great, and so are combinations of BI/optimization/rules, but if the BRMS tool is to survive independently, it needs to find more traction, perhaps outside of Fair Isaac.

Event-centricity driving TIBCO

The call transcript from TIBCO’s Dec 21 review of Q4 results is great reading.  Starting from a simple implementation of the Rete algorithm and the insightful acquisition of Spotfire, TIBCO has transformed itself from a technical middleware vendor into a promising enterprise platform.

TIBCO has a long way to go in making its business optimization offerings less technical, but for those who can tolerate less alignment between IT and the business than may be ideal, TIBCO is leading the way in integrating technical agility with business visibility.

It will be tough for Oracle, IBM, or SAP to close the gap with what TIBCO has.  Don’t be surprised if rule-based, event-driven business processing drives the acquisition of TIBCO by one of them over the next two years.  The growth rate certainly justifies it!  And it won’t stop.

Rules vs. applications of knowledge

I was just asked for some background on business rules and the major players, preferably in the form of videos. The request came in by email, so I didn’t have the opportunity to immediately ask “why”.   Below I give some specific and direct responses, but first a few thoughts about clarifying objectives.

I don’t know of any video that is particularly good from an executive overview standpoint concerning “business rules” or even “decision management”, let alone “management of active knowledge”.  My recommendation is to clarify the objective before drilling into “business rules”, which is a technical perspective.  What is it that you are trying to accomplish?  Most abstractly, it could be to manage and improve the performance of an activity or an organization.  That kind of answer or focus is the right place to start; then work backwards to the technical approach, rather than starting with an inadequately conceived technical need.  This is one of the major problems with business rules as an independent market or product line.

Learning from Enterprise Decision Management

While at Fair Isaac, James Taylor saw this clearly.  He articulated enterprise decision management (EDM) and positioned the business rules capability Fair Isaac acquired with Blaze Software in that space; that is, more as a strategic objective than as a tool or technology.  This is an example of the proper way to think about business rules.

The decision management perspective was also narrowly focused on point decision making (e.g., using rules), but James and others (e.g., John Lucker of Deloitte) have appropriately expanded the strategy of decision management to include analytics, which produce and inform decision making (i.e., business rules), turning point decision making into closed-loop, continuous process improvement.  Over recent years, this has evolved into the broader market of performance management, which also includes performance optimization.

The key thing to keep in mind when considering inquiries about “the applications and market for business rules” is the applications of knowledge.  The “knowledge engineering” community is often too focused on the sources of knowledge.  Focusing on sources rather than opportunities and benefits is a big part of why the business rules market has been subsumed into the business process management market, which is small in comparison to the business intelligence market, the fastest growing segment of which is performance management.

Semantic enterprise performance optimization checklist:

Here’s a checklist to consider when framing strategies and tactics that might involve business rules technology:

  1. What knowledge (including policies, regulations, objectives, goals) is involved?
  2. What knowledge is superficial (i.e., derived from or an approximation of deeper knowledge)?
  3. Will you capture the motivation for a decision rather than how that decision is made using rules?
  4. How will the performance of your decision management or governance system be evaluated?
  5. Is the knowledge involved in evaluating such performance part of the knowledge that you will capture and manage?
  6. How does the manner of evaluation relate to goals and objectives, and over what time frames?
  7. Is the knowledge about goals, objectives, and time frames part of the knowledge to be managed?
  8. Are your decisions rigidly governed in every aspect or do you need the business process to include experimentation and optimization?

Most business rules efforts are focused on contexts so narrow that they are reduced to technical buying criteria without much, or any, consideration of the above.  That is, most business rule efforts do not even cover point 1 above.  Few reach point 2, and only strategic thinkers get to the third.

Specific recommendations for the naive question:

So I went off looking for videos…  You can find some on technical matters involving IBM/Ilog, but I didn’t find any good videos from IBM at the business strategy level concerning knowledge-based process/decision management/governance, which surprised me.

A video from the vendors of Visual Rules touches on many of the traditional buying points that IT people typically formulate before evaluating vendors (here).

Although it did not respond to the inquiry, I sent along this video of James’s, since it touches on so many of the aspects beyond business rules that are increasingly in vogue, even if it does not go far enough towards things like the business motivation model and the market for performance management, imo.

And for a very thorough background in the form of analyst presentations consistent with all of the above, John Rymer of Forrester covers the ground in two longer presentations, here and there.

Please send me any other content that you would recommend!

Paul

Tendencies and purpose matter

The basic formal ontology (BFO) offers a simple, elegant process model.   It adds alethic and teleological semantics to the more procedural models, among which I would include NIST’s process specification language (PSL) along with BPMN.

Although alethic typically refers to necessary vs. possible, it clearly subsumes the probable or expected (albeit excluding deontics0).  For example, consider the notion of ‘disposition’ (shown below as rendered in Protege):

BFO's concept of 'disposition'

For example, cells might be disposed to undergo the cell cycle, which consists of interphase, mitosis, and cytokinesis.  Iron is disposed to rust.  Certain customers might be disposed to comment, complain, or inquire.

Disposition is nice because it reflects things that have an unexpectedly high probability of occurring1 but that may not be a necessary part of a process.  It seems, however, that disposition is lacking from most business process models.  It is prevalent in the soft and hard sciences, though.  And it is important in medicine.

Disposition is distinct from what should occur or be attempted next in a process.  Just because something is disposed to happen does not mean that it should or will.  Although disposition is clearly related to business events and processes, it seems surprisingly lacking from business models (and CEP/BPM tooling).2
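
The distinction drawn above can be sketched in a few lines of Python.  This is an illustrative model, not BFO’s actual formalism: the Disposition class and its names are hypothetical, pairing a bearer type with a process it tends to realize at an elevated probability, so that realization is likely but never a required step.

```python
import random

# Hypothetical sketch (names are illustrative, not BFO's formalism):
# a disposition is a tendency of a bearer to realize a process with
# some elevated probability, without that process being a required step.

class Disposition:
    def __init__(self, bearer_type, process, probability):
        self.bearer_type = bearer_type  # e.g., "iron", "customer"
        self.process = process          # e.g., "rust", "complain"
        self.probability = probability  # elevated likelihood, not certainty

    def realized(self, rng=random.random):
        """A disposition may or may not be realized in any given episode."""
        return rng() < self.probability

rusting = Disposition("iron", "rust", 0.9)
complaining = Disposition("customer", "complain", 0.2)

# Unlike a mandatory process step, realization is probabilistic
# (a fixed draw of 0.5 is used here to make the example deterministic):
print(rusting.realized(lambda: 0.5))      # True: iron tends to rust
print(complaining.realized(lambda: 0.5))  # False: most customers don't complain
```

Nothing in a conventional BPMN flow expresses the middle ground this captures: a step that is neither mandated nor merely possible, but expected at an elevated rate.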

A teleological aspect of BFO is the notion of purpose or intended ‘function’, as shown below:

Function according to the Basic Formal Ontology

Function is about what something is expected to do or what it is for.  For example, what is the function of an actuary?  Representing such functionality of individuals or departments within enterprises may be atypical today, but it is clearly relevant to skills-based routing, human resource optimization, and business modeling in general.

Understanding disposition and function is clearly relevant to business modeling (including organizational structure), planning and performance optimization.    Without an understanding of disposition, anticipation and foresight will be lacking.  Without an understanding of function, measurement, reporting, and performance improvement will be lacking.
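
As a minimal sketch of how an intended function might inform skills-based routing, the following uses hypothetical Role and route names rather than any actual BFO rendering: a role carries the set of functions it is for, and a task is routed to a role whose functions cover it.

```python
from dataclasses import dataclass

# Illustrative only: a role's 'functions' record what the role is for,
# and routing matches a task's required function against them.

@dataclass
class Role:
    name: str
    functions: frozenset  # e.g., pricing risk, approving policies

def route(task_function, roles):
    """Assign a task to the first role whose intended functions cover it."""
    for role in roles:
        if task_function in role.functions:
            return role.name
    return None  # no role is for this task

roles = [
    Role("actuary", frozenset({"price-risk", "estimate-reserves"})),
    Role("underwriter", frozenset({"approve-policy"})),
]

print(route("price-risk", roles))      # actuary
print(route("approve-policy", roles))  # underwriter
```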


0 SBVR does a nice job with alethic and deontic augmentation of first order logic (i.e., positive and negative necessity, possibility, permission, and preference).

1 Thanks to BG for “politicians are disposed to corruption” which indicates a population that is more likely than a larger population to be involved in certain situations.

2 Cyc’s notion of ‘disposition’ or ‘tendency’ is focused on properties rather than probabilities, as in the following citation from OpenCyc.  Such a notion is similarly lacking from most business models, probably because its utility requires more significant reasoning and business intelligence than is common within enterprises. 

The collection of all the different quantities of dispositional properties; e.g. a particular degree of thermal conductivity. The various specializations of this collection are the collections of all the degrees of a particular dispositional property. For example, ThermalConductivity is a specialization of this collection and its instances are usually denoted with the generic value functions as in (HighAmountFn ThermalConductivity).

Probabilities are Better than Scores

Strategic Analytics slide from Fair Isaac Interact on the 2007 mortgage meltdown

During a panel at Fair Isaac’s Interact conference last week, a banker from Abbey National in the UK suggested that part of the credit crunch was due to the use of the FICO score.  Unlike the other panelists, who were former Fair Isaac employees, this gentleman was formerly of Experian!  So there was perhaps some friendly rivalry, but his point was a good one.  He cited an earlier presentation by the founder of Strategic Analytics that touched on the divergence between FICO scores and the probability of default.  The panelist’s key point was that some part of the mortgage crisis could be blamed on credit scores, a point first raised in the media last fall.

The FICO score is not a probability. 

Fair Isaac people describe the FICO score as a ranking of creditworthiness.  And banks rely on the FICO score for pricing and qualification for mortgages.  The ratio of the loan to value is also critical, but for any two applicants seeking a loan with the same LTV, the one with the better FICO score is more likely to qualify and receive the better price.

Ideally, a bank’s pricing and qualification criteria would accurately reflect the likelihood of default.  The mortgage crisis demonstrates that their assessment, expressed with the FICO score, was wrong.  Their probabilities were off. Continue reading “Probabilities are Better than Scores”
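
The distinction can be sketched numerically.  All calibrations and numbers below are hypothetical: a score fixes the ranking of applicants, but the probability of default (PD) implied by a given score depends on a calibration that can shift with the environment.

```python
import math

def pd_from_score(score, intercept, slope):
    """A simple logistic calibration mapping a score to a probability
    of default; the intercept and slope depend on the environment."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

scores = [620, 700]  # applicant A ranks worse than applicant B

# Hypothetical calibrations for a benign and a stressed credit regime:
benign = [pd_from_score(s, 10.0, -0.02) for s in scores]
stressed = [pd_from_score(s, 6.0, -0.01) for s in scores]

# The ranking is preserved in both regimes...
assert benign[0] > benign[1] and stressed[0] > stressed[1]

# ...but the PD implied by the same 620 score shifts with the environment:
print(round(benign[0], 3), round(stressed[0], 3))  # 0.083 0.45
```

A lender pricing off the score alone is implicitly trusting one fixed calibration; when the environment shifts, the ranking stays right while the implied probabilities go badly wrong.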

Super Crunchers: predictive analytics is not enough

Ian Ayres, the author of Super Crunchers, gave a keynote at Fair Isaac’s Interact conference in San Francisco this morning.   He made a number of interesting points related to his thesis that intuitive decision making is doomed.   I found his points on random trials much more interesting, however.

In one of his examples on “The End of Intuition”, a computer program using six variables did a better job of predicting Supreme Court decisions than a team of experts.  He focused on the fact that the program “discovered” that one justice would most likely vote against an appeal if it was labeled a liberal decision.  By “discovered” we mean that the decision tree for this justice’s vote had a top-level split on whether the decision was liberal, in which case the program had no further concern for any other information.  Continue reading “Super Crunchers: predictive analytics is not enough”
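
A toy sketch of the kind of tree described, with entirely hypothetical feature names and splits: once the top-level test on the liberal label fires, no other variable is consulted.

```python
# Illustrative only: a toy decision tree for one justice's vote on an
# appeal, in the shape described above. Feature names are invented.

def predict_vote(case):
    # Top-level split: if the lower-court decision is labeled liberal,
    # the prediction is made without consulting any other information.
    if case["decision_label"] == "liberal":
        return "against appeal"
    # Only non-liberal decisions fall through to further (toy) splits.
    if case["unanimous_below"]:
        return "against appeal"
    return "for appeal"

print(predict_vote({"decision_label": "liberal", "unanimous_below": False}))
# against appeal
print(predict_vote({"decision_label": "conservative", "unanimous_below": False}))
# for appeal
```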

Adaptive Decision Management


In this article I hope you will learn about the future of predictive analytics in decision management, and how tighter integration between rules and learning is being developed that will adaptively improve diagnostic capabilities, especially in maximizing profitability and detecting adversarial conduct such as fraud, money laundering, and terrorism.

Business Intelligence

Visualizing business performance is obviously important, but improving business performance is even more important.  A good view of operations, such as this nice dashboard[1], helps management see the forest (and, with good drill-down, some interesting trees). 

With good visualization, management can gain insights into how to improve business processes, but if the view does not include a focus on outcomes, improvement in operational decision making will be relatively slow in coming.

Whether or not you use business intelligence software to produce your reports or present dashboards, you can improve your operational decision management by applying statistics and other predictive analytic techniques to discover hidden correlations between what you know before a decision and what you learn afterwards, improving your decision making over time.  Continue reading “Adaptive Decision Management”
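
A minimal sketch of that closed loop, with hypothetical data: record what was known before each decision alongside the outcome learned afterwards, then refit the decision rule’s threshold to the observed outcomes.

```python
# Illustrative only: each pair records a value known before the decision
# and whether a bad outcome was observed afterwards.
history = [(0.2, False), (0.4, False), (0.6, True), (0.8, True), (0.9, True)]

def refit_threshold(history):
    """Pick the cutoff for the rule 'flag when value >= cutoff' that
    best separates good outcomes from bad ones in the history."""
    candidates = sorted(v for v, _ in history)
    best, best_errors = None, len(history) + 1
    for cut in candidates:
        # Count disagreements between the rule under test and outcomes.
        errors = sum((v >= cut) != bad for v, bad in history)
        if errors < best_errors:
            best, best_errors = cut, errors
    return best

print(refit_threshold(history))  # 0.6 separates these outcomes perfectly
```

Repeating this refit as new outcomes arrive is the adaptive loop in miniature: the rule stays declarative, while the analytics keep its parameters honest.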