[rules-users] Rule processing for High volume input

techy techluver007 at gmail.com
Thu Nov 6 16:41:41 EST 2008


I have to run rules (about 1000 of them) against 20-30 million records each day.
I expect I would run into memory problems if I inserted all of the records into
working memory at once.

So I'm thinking of creating multiple rule base instances and doing batch
processing as outlined in the steps below (and sketched in code after them),
so that I can avoid the memory problem and keep rule processing quick:


a. Take the first 1 million records and distribute them across the working
memories of 5 rule bases.
b. Fire the rules in all 5 rule bases.
c. Clear the working memory of every rule base.
d. Take the next 1 million records and repeat from step a until all of the
records have been processed.
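
Roughly, I mean something like the following. This is only a minimal sketch
assuming the Drools 4.x RuleBase / StatefulSession API; the Record class, the
rules.drl classpath resource, and the loadNextBatch() helper are placeholders
for my actual fact type and input reader:

import java.io.InputStreamReader;
import java.util.List;

import org.drools.RuleBase;
import org.drools.RuleBaseFactory;
import org.drools.StatefulSession;
import org.drools.compiler.PackageBuilder;

public class BatchRuleRunner {

    private static final int PARTITIONS = 5;       // step a: 5 rule bases
    private static final int BATCH_SIZE = 1000000; // records per batch

    public static void main(String[] args) throws Exception {
        // Compile the rules once and share the resulting package across all
        // rule base instances; compilation is the expensive part.
        PackageBuilder builder = new PackageBuilder();
        builder.addPackageFromDrl(new InputStreamReader(
                BatchRuleRunner.class.getResourceAsStream("/rules.drl")));

        RuleBase[] ruleBases = new RuleBase[PARTITIONS];
        for (int i = 0; i < PARTITIONS; i++) {
            ruleBases[i] = RuleBaseFactory.newRuleBase();
            ruleBases[i].addPackage(builder.getPackage());
        }

        List<Record> batch;
        while (!(batch = loadNextBatch(BATCH_SIZE)).isEmpty()) {     // step d
            // Step a: fresh sessions, batch distributed round-robin.
            StatefulSession[] sessions = new StatefulSession[PARTITIONS];
            for (int i = 0; i < PARTITIONS; i++) {
                sessions[i] = ruleBases[i].newStatefulSession();
            }
            for (int i = 0; i < batch.size(); i++) {
                sessions[i % PARTITIONS].insert(batch.get(i));
            }
            // Steps b and c: fire the rules, then release the working memory.
            for (StatefulSession session : sessions) {
                session.fireAllRules();
                session.dispose();  // frees the facts instead of retracting them one by one
            }
        }
    }

    // Placeholder: read up to 'size' records from the data source.
    private static List<Record> loadNextBatch(int size) {
        throw new UnsupportedOperationException("supply your own loader");
    }

    // Placeholder fact type.
    static class Record { }
}

Since each StatefulSession is independent, the five fireAllRules() calls could
also be run on separate threads to overlap the work, and disposing each session
after a batch releases its facts without retracting them individually.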


Please advise whether this is a feasible/recommended approach or not.

thanks
-- 
View this message in context: http://www.nabble.com/Rule-processing-for-High-volume-input-tp20370272p20370272.html
Sent from the drools - user mailing list archive at Nabble.com.



