[rules-users] Working memory batch insert performance

Zhuo Li milanello1998 at gmail.com
Tue Dec 20 08:09:23 EST 2011


Hi, folks,

I recently ran a benchmark on Drools 5.1.2 and noticed that inserting data into
a stateful session is very time consuming. It took about 30 minutes to insert
10,000 data rows on a JVM with a 512 MB heap. As a result, I have to insert data
rows as I receive them and keep them in working memory, rather than loading them
in one batch at a scheduled time. This is not a friendly setup for disaster
recovery, and I have two questions to see if anybody has any thoughts:

1. Is there a better way to improve the performance of inserting data into a
stateful session? (A rough sketch of the kind of loading I mean follows this list.)

2. I noticed there is a BatchExecution command for stateless sessions. I have
not had a chance to test it yet, but is it a better way to load data in a batch
and then run the rules? (A sketch of how I understand it appears after my
requirements below.)
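
For context, the stateful loading I have in mind looks roughly like the sketch
below. This is only a minimal, untested sketch against the Drools 5.x knowledge
API; the rules file name (matching.drl) and the row collection are placeholders,
not my real model. The idea is to insert the whole batch first and call
fireAllRules() only once at the end.

import java.util.Collection;

import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;

public class StatefulBatchLoad {

    // Build the knowledge base once and reuse it; compiling the DRL is expensive.
    static KnowledgeBase buildKnowledgeBase() {
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newClassPathResource("matching.drl"), ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());
        return kbase;
    }

    // Insert the whole batch, then fire the rules once at the end,
    // instead of firing after every insert.
    static void loadAndMatch(KnowledgeBase kbase, Collection<?> rows) {
        StatefulKnowledgeSession ksession = kbase.newStatefulKnowledgeSession();
        try {
            for (Object row : rows) {
                ksession.insert(row);
            }
            ksession.fireAllRules();
        } finally {
            ksession.dispose(); // release working memory when done
        }
    }
}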

My requirement is that I need to load one batch of data at the end of each day
and then run the rules to separate matched data from unmatched data. I have a
3-hour processing window to complete this loading and matching process, and the
data volume is about 1 to 2 million rows. My JVM heap size can be set up to
1024 MB.
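
For question 2, my rough understanding of the stateless batch API is the sketch
below. Again this is untested, based only on my reading of the CommandFactory
javadoc, and the names are placeholders. A stateless session should fire the
rules as part of execute(), so I have not added an explicit fireAllRules command.

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import org.drools.KnowledgeBase;
import org.drools.command.Command;
import org.drools.command.CommandFactory;
import org.drools.runtime.StatelessKnowledgeSession;

public class StatelessBatchLoad {

    // Wrap the whole day's data in a single batch execution command.
    static void loadAndMatch(KnowledgeBase kbase, Collection<?> rows) {
        StatelessKnowledgeSession ksession = kbase.newStatelessKnowledgeSession();

        List<Command> cmds = new ArrayList<Command>();
        cmds.add(CommandFactory.newInsertElements(rows));
        ksession.execute(CommandFactory.newBatchExecution(cmds));

        // For plain inserts this simpler form should be equivalent:
        // ksession.execute(rows);
    }
}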

Best regards

Abe


