I am benchmarking Drools 4.0 and have gotten excellent results for rule execution. With our non-optimized test rule set and data, Drools outperformed ILOG/JRules by quite a bit. Congratulations to the Drools team for a job well done.
I encountered some strange behavior when testing the performance of rule registration, that is, adding Packages to a new instance of RuleBase. I ran the same test 5 times in sequence, and the RuleBase instance was discarded right after each run. The test ran with different numbers of packages: 20, 50, 100, and 200.
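For reference, each run does essentially the following (a simplified sketch; the rules<i>.drl resource names are placeholders for our actual test rule files, the packages are compiled once up front so only RuleBase creation and addPackage() calls fall inside the timed section, and the real test formats elapsed time as h:mm:s.millis rather than printing raw milliseconds):

import java.io.InputStreamReader;

import org.drools.RuleBase;
import org.drools.RuleBaseFactory;
import org.drools.compiler.PackageBuilder;
import org.drools.rule.Package;

public class RuleBaseRegistrationBenchmark {

    public static void main(String[] args) throws Exception {
        // Compile the test packages once up front so that only
        // RuleBase creation and addPackage() calls are timed.
        // rules0.drl, rules1.drl, ... are placeholders for our test rule files.
        int packageCount = 20;
        Package[] packages = new Package[packageCount];
        for (int i = 0; i < packageCount; i++) {
            PackageBuilder builder = new PackageBuilder();
            builder.addPackageFromDrl(new InputStreamReader(
                    RuleBaseRegistrationBenchmark.class
                            .getResourceAsStream("/rules" + i + ".drl")));
            packages[i] = builder.getPackage();
        }

        // Run the same registration test 5 times in sequence; the
        // RuleBase instance is discarded right after each run.
        long total = 0;
        for (int run = 0; run < 5; run++) {
            long start = System.currentTimeMillis();
            RuleBase ruleBase = RuleBaseFactory.newRuleBase();
            for (int i = 0; i < packages.length; i++) {
                ruleBase.addPackage(packages[i]);
            }
            long elapsed = System.currentTimeMillis() - start;
            total += elapsed;
            System.out.println("Added " + packageCount
                    + " packages to rulebase in " + elapsed + " ms");
        }
        System.out.println("Added " + packageCount
                + " packages to rulebase 5 times in " + total + " ms");
    }
}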
The results are as follows:

Added 20 packages to rulebase in 0:00:0.15
Added 20 packages to rulebase in 0:00:0.21
Added 20 packages to rulebase in 0:00:0.21
Added 20 packages to rulebase in 0:00:0.301
Added 20 packages to rulebase in 0:00:0.18
Added 20 packages to rulebase 5 times in 0:00:1.051

Added 50 packages to rulebase in 0:00:0.17
Added 50 packages to rulebase in 0:00:1.422
Added 50 packages to rulebase in 0:00:1.252
Added 50 packages to rulebase in 0:00:1.322
Added 50 packages to rulebase in 0:00:1.331
Added 50 packages to rulebase 5 times in 0:00:5.497

Added 100 packages to rulebase in 0:00:0.19
Added 100 packages to rulebase in 0:00:5.278
Added 100 packages to rulebase in 0:00:5.297
Added 100 packages to rulebase in 0:00:5.228
Added 100 packages to rulebase in 0:00:5.327
Added 100 packages to rulebase 5 times in 0:00:21.32

Added 200 packages to rulebase in 0:00:0.22
Added 200 packages to rulebase in 0:00:21.691
Added 200 packages to rulebase in 0:00:21.481
Added 200 packages to rulebase in 0:00:21.642
Added 200 packages to rulebase in 0:00:21.341
Added 200 packages to rulebase 5 times in 0:01:26.385
As you can see, the timing for creating the first RuleBase instance is always good, while the subsequent ones are very bad. Is there a configuration attribute I could adjust to improve performance in this scenario? Please also advise if there are other alternatives. I would be glad to provide more details if needed.
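In case it helps, the RuleBase is currently created with the default configuration. My assumption is that a tuning attribute, if one exists, would be set on a RuleBaseConfiguration and passed to the factory, along these lines:

import org.drools.RuleBase;
import org.drools.RuleBaseConfiguration;
import org.drools.RuleBaseFactory;

public class ConfiguredRuleBase {

    public static RuleBase create() {
        // Assumption: whatever attribute affects addPackage() performance
        // would be set here before the RuleBase is constructed.
        RuleBaseConfiguration conf = new RuleBaseConfiguration();
        // conf.set...(...);  // the attribute I am asking about, if it exists
        return RuleBaseFactory.newRuleBase(conf);
    }
}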
Your help is greatly appreciated!
-Ming