Great help, Ed.
I appreciate your reply, and thanks again for your valuable time.
Regards,
Ashish Soni
On Thu, Feb 12, 2009 at 12:18 PM, Edson Tirelli <tirelli(a)post.com> wrote:
Ashish,
"If i have 10,000 rules with same priority or equal priority then how
drools engine evaluate them or what is the path it chooses."
The LHS of each rule is evaluated at insert time, as you may know, since
that is how the Rete algorithm works. At fire time, the engine uses conflict
resolution strategies to decide which rules to fire. Typically, the
resolution strategy is driven by the grouping capabilities that Drools offers
(like ruleflow-groups, agenda-groups, etc.) and then falls back to salience
(priority) and finally (pseudo) recency.
That said, if you have two rules for which all of these parameters are the
same (groups, salience, etc.), your application shouldn't care which one
fires first, because they are supposed to have the same priority.
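To make the salience part concrete, here is a minimal, illustrative sketch
against the Drools 5 knowledge-api (the rule names, the inline DRL, and the
String fact are made up, and package names may differ slightly between
versions). With equal salience, the relative firing order of the two rules
is not something your application should rely on:

import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;

public class SalienceDemo {
    // Two rules matching the same fact; the higher salience fires first.
    private static final String DRL =
        "package demo\n" +
        "rule \"High priority\"\n" +
        "    salience 10\n" +
        "when\n" +
        "    $s : String()\n" +
        "then\n" +
        "    System.out.println(\"fires first: \" + $s);\n" +
        "end\n" +
        "rule \"Low priority\"\n" +
        "    salience 1\n" +
        "when\n" +
        "    $s : String()\n" +
        "then\n" +
        "    System.out.println(\"fires second: \" + $s);\n" +
        "end\n";

    public static void main(String[] args) {
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newByteArrayResource(DRL.getBytes()),
                     ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());

        StatefulKnowledgeSession ksession = kbase.newStatefulKnowledgeSession();
        try {
            ksession.insert("some fact"); // LHS is evaluated here (Rete)
            ksession.fireAllRules();      // conflict resolution decides order here
        } finally {
            ksession.dispose();
        }
    }
}

Swap the two salience values and the output order flips; remove them and the
order is no longer something to depend on.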
"Also if there any performance metrics available ,please point to the
links."
This is the most complicated point. There are quite a few benchmarks out
there, but in the end the performance of your rules depends not only on the
engine itself, but also on how the rules are written and what your domain
model looks like. A good analogy is a database: everyone knows that the
quality of the data model, the availability of indexes, and the way a query
is written directly determine query performance. So the existing benchmarks
try to derive simple cases that allow comparisons between engines, but NONE
of them is capable of predicting the performance of REAL WORLD applications.
Just to mention one example: I created a simple test a couple of weeks ago
for something I was doing. I generated 1,000 rules, built a rulebase, and
started feeding facts into the engine and processing them with those 1,000
rules. On my laptop, a Lenovo T61 (2 GHz, 2 GB RAM), I was able to process an
average of 38,000 events/second. What does that tell you? We both know that
it tells you nothing, because it depends on the rules I generated and the
optimizations those rules were (or were not) triggering in the engine.
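For what it is worth, the shape of such a test is roughly the following.
This is only a sketch, assuming the generated rules were written to a
rules.drl file on the classpath; the file name, the Event fact class, and
the event count are placeholders, and the number it prints is only
meaningful for those specific rules and facts:

import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;

public class ThroughputSketch {
    // Trivial placeholder fact; a real domain model will behave differently.
    public static class Event {
        private final int id;
        public Event(int id) { this.id = id; }
        public int getId() { return id; }
    }

    public static void main(String[] args) {
        // Compile the generated rules once, up front.
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newClassPathResource("rules.drl"),
                     ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());

        StatefulKnowledgeSession ksession = kbase.newStatefulKnowledgeSession();
        int events = 100000; // placeholder volume
        long start = System.nanoTime();
        try {
            for (int i = 0; i < events; i++) {
                ksession.insert(new Event(i));
                ksession.fireAllRules();
            }
        } finally {
            ksession.dispose();
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        System.out.printf("%.0f events/second%n", events / seconds);
    }
}

Change the rules, the fact model, or the insert/fire pattern and that number
changes completely.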
So, the one piece of advice we give to users is: build a POC for YOUR use
case. Take everything into consideration, not only performance. In 99% of
cases, the bottleneck in real-world applications that use rules engines will
be I/O, especially database I/O, meaning that the most important aspects of
a rules engine for your case will not be raw performance, but things like
rule management, language expressiveness, time to market, etc.
I know it is not the kind of answer managers look for, but it is the
bare naked truth.
Hope it helps,
Edson
2009/2/12 Ashish Soni <learnspring(a)gmail.com>
Any further response would be appreciated, as I am in the final phase of my
> Drools evaluation and need this urgently.
>
> regards,
> Ashish
>
> On Wed, Feb 11, 2009 at 3:21 PM, Steve Núñez <steve.nunez(a)illation.com.au> wrote:
>
>> Ashish,
>>
>> There are a lot of variables involved in performance benchmarking. Your
>> scenario is common in many industries, such as insurance, that utilise large
>> decision tables.
>>
>> We have previously published benchmarks comparing various rules engines at
>> http://illation.com.au/benchmarks; however, they don't specifically address
>> the use case of large rulesets, but instead stress various aspects of the
>> rules engines.
>>
>> We are working to put together a new set of benchmarks that will better
>> test typical use cases, and we'd welcome your comments and participation. We
>> are very early in this process, and should have a publicly available blog
>> to discuss the topic.
>>
>> Regards,
>> - Steve Nunez
>>
>>
>>
>> On 11/02/09 11:37 AM, "Ashish Soni" <learnspring(a)gmail.com> wrote:
>>
>> Hi All,
>>
>> I am not sure if this question has been asked previously, but it would be
>> great if anyone could shed some light on this.
>>
>> If I have 10,000 rules with the same or equal priority, how does the
>> Drools engine evaluate them, and what path does it choose?
>>
>> Also, if there are any performance metrics available, please point me to
>> the links.
>>
>>
>> Thanks and Regards,
>> Ashish Soni
>>
>>
>
--
Edson Tirelli
JBoss Drools Core Development
JBoss, a division of Red Hat @ www.jboss.com
_______________________________________________
rules-users mailing list
rules-users(a)lists.jboss.org
https://lists.jboss.org/mailman/listinfo/rules-users