Form, function and complexity in rule processing
Charles Young
Tim Bass posted on ‘Orwellian Event Processing’. I was involved in a heated exchange in the comments, and he has more recently published a post entitled ‘Disadvantages of Rule-Based Systems (Part 1)’. Whatever the rights and wrongs of our exchange, it clearly failed to generate any agreement or understanding of our different positions. I don't particularly want to promote further argument of that kind, but I do want to take the opportunity of offering a different perspective on rule-processing and an explanation of my comments.
For me, the ‘red rag’ lay in Tim’s claim that “...rules alone are highly inefficient for most classes of (not simple) problems” and in a later paragraph that appears to equate simplicity of form (‘IF-THEN-ELSE’) with simplicity of function. It is not the first time Tim has expressed these views, nor the first time I have responded to his assertions. Indeed, Tim has a long history of commenting on complex event processing (CEP) and, less often, rule processing in ‘robust’ terms, often asserting that very many other people’s opinions on this subject are mistaken. In turn, I am of the opinion that, certainly in terms of rule processing, an area in which I have specific interest and knowledge, he is often mistaken.
There is no simple answer to the fundamental question ‘what is a rule?’ We use the word in a very fluid fashion in English. Likewise, the term ‘rule processing’, as used widely in IT, resists any simple definition. The best way to envisage the term is as a ‘centre of gravity’ within a wider domain. That domain contains many other ‘centres of gravity’, including CEP, statistical analytics, neural networks, natural language processing and much more. Whole communities tend to gravitate towards, and build themselves around, some of these centres.
The term 'rule processing' is associated with many different technology types, various software products, different architectural patterns, the functional capability of many applications and services, etc. There is considerable variation amongst these different technologies, techniques and products. Very broadly, a common theme is their ability to manage certain types of processing and problem solving through declarative, or semi-declarative, statements of propositional logic bound to action-based consequences. It is generally important to be able to decouple these statements from other parts of an overall system or architecture so that they can be managed and deployed independently. 
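To make that theme concrete, here is a minimal sketch, in Python with invented names (no particular rule engine or product is implied), of the pattern just described: rules as declarative condition/consequence pairs held as data, so that the rule set can be managed and redeployed independently of the engine that evaluates it.

```python
from dataclasses import dataclass
from typing import Callable

# A rule is data: a propositional condition bound to an action-based
# consequence. Nothing here is specific to any real rule engine.
@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

def run(rules: list[Rule], facts: dict) -> None:
    """Fire the consequence of every rule whose condition holds."""
    for rule in rules:
        if rule.condition(facts):
            rule.action(facts)

# The rule set is decoupled from the host system: it could equally be
# loaded from a file or database and redeployed independently.
rules = [
    Rule(
        name="flag-large-order",
        condition=lambda f: f.get("order_total", 0) > 10_000,
        action=lambda f: f.update(review_required=True),
    ),
]

facts = {"order_total": 25_000}
run(rules, facts)
print(facts)  # {'order_total': 25000, 'review_required': True}
```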
As a centre of gravity, ‘rule processing’ is no island. It exists in the context of a domain of discourse that is, itself, highly interconnected and continuous. Rule processing does not, for example, exist in splendid isolation from natural language processing. On the contrary, an ongoing theme of rule processing is to find better ways to express rules in natural language and map them to executable forms. Rule processing does not exist in splendid isolation from CEP. On the contrary, an event processing agent can reasonably be considered a rule engine (a theme in ‘The Power of Events’ by David Luckham). Rule processing does not live in splendid isolation from statistical approaches such as Bayesian analytics. On the contrary, rule processing and statistical analytics are highly synergistic. Rule processing does not even live in splendid isolation from neural networks. For example, significant research has centred on finding ways to translate trained nets into explicit rule sets in order to support forms of validation and facilitate insight into the knowledge stored in those nets.
What about simplicity of form? Many rule processing technologies do indeed use a very simple form (‘If...Then’, ‘When...Do’, etc.). However, it is a fundamental mistake to equate simplicity of form with simplicity of function. It is absolutely mistaken to suggest that simplicity of form is a barrier to the efficient handling of complexity. There are countless real-world examples which serve to disprove that notion. Indeed, simplicity of form is often the key to handling complexity.
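One hedged illustration of that last point (my own toy example, not drawn from any product): the two ‘If...Then’ rules below, applied by naive forward chaining, compute the transitive closure of a parent relation. Each rule is trivially simple in form; the function they jointly perform, a recursive fixed-point computation, is not.

```python
# Facts are tuples; two simple rules derive the 'ancestor' relation.
facts = {
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
    ("parent", "carol", "dave"),
}

def forward_chain(facts: set) -> set:
    """Repeatedly apply both rules until no new facts can be derived."""
    derived = set(facts)
    while True:
        new = set()
        # Rule 1: IF parent(x, y) THEN ancestor(x, y)
        for kind, x, y in derived:
            if kind == "parent":
                new.add(("ancestor", x, y))
        # Rule 2: IF ancestor(x, y) AND parent(y, z) THEN ancestor(x, z)
        for kind, x, y in derived:
            if kind != "ancestor":
                continue
            for kind2, y2, z in derived:
                if kind2 == "parent" and y2 == y:
                    new.add(("ancestor", x, z))
        if new <= derived:  # fixed point reached: nothing new derivable
            return derived
        derived |= new

for fact in sorted(forward_chain(facts)):
    print(fact)
# ('ancestor', 'alice', 'dave') appears, though no single rule states it.
```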
Does rule processing offer a ‘one size fits all’ solution? No, of course not. No serious commentator suggests it does. Does the design and management of large knowledge bases, expressed as rules, become difficult? Yes, it can do, but that is true of any large knowledge base, regardless of the form in which knowledge is expressed.
The measure of complexity is not a function of rule set size or rule form. It tends to be correlated more strongly with the size of the ‘problem space’ (‘search space’), which is something quite different. Analysis of the problem space and of the algorithms we use to search through that space are, of course, the very things we use to derive objective measures of the complexity of a given problem. This is basic computer science and common practice.
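A toy measurement, with illustrative numbers only, makes the distinction visible: below, the ‘rule set’ is a single all-values-distinct constraint and never changes, while the problem space, and therefore the cost of an exhaustive search through it, grows exponentially with the number of variables.

```python
from itertools import product

# The entire 'rule set': one constraint, fixed in size and trivial in form.
def satisfies(assignment: tuple) -> bool:
    return len(set(assignment)) == len(assignment)  # all values distinct

for n_vars in (3, 5, 7):
    domain = range(n_vars)
    space_size = len(domain) ** n_vars  # size of the problem/search space
    solutions = sum(
        1 for candidate in product(domain, repeat=n_vars)
        if satisfies(candidate)
    )
    print(f"{n_vars} variables: space = {space_size:>9,}, solutions = {solutions:,}")
# The rule count is constant; the work is governed by the space searched.
```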
Sailing a dreadnought through the sea of information technology and lobbing shells at some of the islands we encounter along the way does no one any good. Building bridges and causeways between islands, so that the inhabitants can collaborate in open discourse, offers hope of real progress.
Posted on Saturday, March 6, 2010 1:39 PM


Comments on this post: Form, function and complexity in rule processing

# re: Form, function and complexity in rule processing
To a foreign observer, Tim's public persona is rather childish. Chest-thumping, etc. It's easy to get caught up and become emotional. Semantic arguments about rules and expert systems aside, neither one of you really attempted to say what "inefficient" means. Inefficient compared to what, and how? Tim has latched on to maintenance overhead as a measure of efficiency. You seem to be thinking in terms of performance (I am reading between the lines). Performance and maintenance are but two measures of efficiency; surely there are more... and there's no general definition of efficiency; it's very context-sensitive.

Writing rules in Perl did give me a chuckle though.
Left by Jackson Pollock on Mar 06, 2010 2:50 PM

# re: Form, function and complexity in rule processing
Thanks Jackson.

Your comments are insightful :-) I should have learned by this time to keep a cool head when trying to debate these sorts of issues - hey, it's only technology, not a matter of life and death. I certainly failed there, especially in my earlier comments on Tim's site. It never pays to get emotional about these things.

In terms of the meaning of the word 'efficiency', I did consciously recognise this problem and tried to steer away from falling into the 'performance' trap. Like you, I don't believe Tim was talking about performance. I didn't really think primarily about maintenance of existing rule sets, either. I interpreted his comments more in terms of the efficiency with which complex problems can be addressed through the design of rule sets, large or small, and the initial development of knowledge bases. This drove my focus on challenging the assertions about form. Looking back at this post, I can see that, by talking about searches through problem spaces, I am walking dangerously close to the 'performance' trap. The most commonly used measures are, after all, those of time and space complexity, both of which are closely associated with run-time performance. However, what was uppermost in my mind at the time of writing was that I have seen relatively small, elegant rule sets that conduct sophisticated searches through problem spaces in order to solve complex problems. Thanks for the opportunity to clarify that.
Left by Charles Young on Mar 06, 2010 3:24 PM

# re: Form, function and complexity in rule processing
For some reason that blog "discussion" reminded me of the first part of this classic film [Time Bandits] clip...

http://www.youtube.com/watch?v=JEenKy1S4S0

:)
Left by Paul Vincent on Mar 07, 2010 6:01 AM

# re: Form, function and complexity in rule processing
No comment :-)
Left by Charles Young on Mar 07, 2010 6:20 AM

# re: Form, function and complexity in rule processing
Hi Charles - Mark Palmer here, CEO of StreamBase. As I think you know, I read and comment just about everywhere on CEP. But I have not read the conversation you mention because, frankly, nobody does. I don't know anyone in the CEP industry who even reads Tim's stuff any more. I stopped reading and commenting on his posts when I finally realized that I've never had a customer, prospect, or analyst say: "I read this cogent point Tim Bass made, would you care to comment?" I've only seen folks scratch their heads and conclude that Tim believes nobody but Tim understands CEP, that the products in the market are crap, and that we're all a bunch of frauds. Meanwhile, we happily go on building a wonderful business; our customers love us, we're growing, and we're having fun.

So if you'd like to take an interesting discourse to a channel that folks listen to and comment on, I'm sure plenty of people will participate on your blog; it's clearly a better use of your time.

- Mark
Left by Mark Palmer on Mar 07, 2010 10:43 AM

# re: Form, function and complexity in rule processing
Thanks Mark.

For my part, I do not consider the CEP vendors to be a bunch of frauds, or your products and technologies to be somehow invalid. Nor, I assume, do your customers. I'm very interested in seeing how CEP will develop and evolve over the coming years. One thing is for sure: CEP will evolve according to market forces - if there isn't a market for a given approach, it won't happen. If there is, it will.
Left by Charles Young on Mar 07, 2010 12:14 PM

# re: Form, function and complexity in rule processing
Of course, I know you've been very constructive and balanced in your opinions. And I agree, it's a fascinating space and there's lots of innovation to go. Funny, this weekend I was editing our next blog post, which will go out on Monday, about yet another big-name customer who did a bake-off of StreamBase versus traditional programming tools and measured a 50% savings for their event processing applications. We announced the StreamBase Component Exchange, and that's been gaining momentum as we collect contributions there. That's the stuff we care about - as you say - the evolution of this market.
Left by Mark Palmer on Mar 07, 2010 12:24 PM

# re: Form, function and complexity in rule processing
I'm behind in my blog reading, so pardon me for this late comment.

I am, in part, a statistician. And I can tell you that statisticians (let's call anyone who applies probability to real data a statistician) have just as many horror stories as any other field. There's nothing magic about statistics, even if it does lately attract a certain aura.

You bring up the point that rule processing does not stand alone and sometimes requires additional techniques like statistics (by which I include statistical learning and such). I want to add that statistics is never, ever stand-alone.

Fundamentally, statistics is just probabilistic interpretation of rules. Any statistician would be utterly puzzled by an attempt to remove rules from statistics.

Also, from a very practical perspective, every single real-life use of statistics has non-statistical rules all over the place. These rules normalize data, prevent statistical software from making algorithmically correct yet plainly silly assertions, and even improve prediction through common-sense, deterministic decision making. And rules are always used to interpret and apply the results of data processing.
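A minimal sketch of that pattern (the model, names and thresholds below are invented purely for illustration): a toy statistical estimate fenced in on both sides by plain, deterministic rules.

```python
def predict_demand(raw_units: float, trend: float) -> float:
    # Rule: normalize the input before the statistical step.
    units = max(raw_units, 0.0)          # negative readings are sensor noise

    # 'Statistical' step: a toy linear model standing in for a real one.
    estimate = 0.9 * units + 50.0 * trend

    # Rule: veto an algorithmically correct but plainly silly assertion.
    estimate = max(estimate, 0.0)        # demand cannot be negative

    # Rule: common-sense, deterministic override of the prediction.
    if units == 0.0 and trend <= 0.0:
        estimate = 0.0                   # no history, no growth: predict nothing

    return estimate

print(predict_demand(raw_units=-3.0, trend=-0.2))  # 0.0, not a negative forecast
```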

So while rules sometimes require statistics, statistics always always always requires rules.

Also, I will add my own voice to the warnings about interacting with Tim, for reasons already mentioned. My advice is to leave him be.
Left by Hans Gilde on Mar 12, 2010 12:17 PM

# re: Form, function and complexity in rule processing
Thanks Hans. A very interesting insight.

To my way of thinking, it always comes back to what you mean by 'rule processing'. I was part of a conversation a few days ago where someone I know with considerable expertise in constraint satisfaction misinterpreted the meaning of another person with a strong background in a well-known rule engine. 'Rule guy' used the term 'rule engine' casually in an all-inclusive sense to mean many different technologies covering many different approaches. 'CS guy' heard something more like 'business rule system' or 'Rete engine', and, in the context of the conversation, reacted strongly.

To my way of thinking, although my knowledge and experience in this area is chiefly around business rule processing and Rete engines, I see rule processing as a very inclusive, wide-ranging term. Indeed, it is difficult to draw precise and hard boundaries. CS involves rules. CSP involves rules. Statistical techniques, as you suggest, involve rules. Expert systems involve rules, etc. I talked about neural nets in this post because, in a sense, they encode learned rules, albeit in an opaque fashion.

My hunch is that the future of 'intelligent systems' lies in working out how to combine and fuse different approaches into forms that are usable and which a wide audience can exploit in order to tackle problems that, today, take considerable human effort. Bringing this stuff together synergistically is a very complex undertaking. There is the more theoretical problem of finding common formalisms that underpin different approaches, and the more practical problem of representing complex, wide-ranging capabilities in ways that are comprehensible to the average human. We take baby steps, only.
Left by Charles Young on Mar 12, 2010 1:31 PM


