[rules-users] Size of working memory

soc.frangis soc.frangis at gmail.com
Tue Aug 3 19:45:01 EDT 2010


I am in a very similar situation. We all know that the number of facts plus the
way the rules are written and optimized can greatly affect memory consumption.
But there has to be some ballpark number we could throw out that gives a
feasible heap allocation in most situations. By the way, the reason I am
bringing this up is that I am being asked 'what is the minimum amount of
RAM you can run off of, because it is being cut'.

Personally, I am in a situation with a batch of 16k-20k POJOs (one fact model
type) and 200-300 rules, where the rules just pattern match the fields of
the Java object and then set a boolean within the POJO if a given rule
fires.
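
For concreteness, a minimal sketch of the kind of setup I mean, using the
Drools 5.x knowledge API. The Item class, its fields, and the rule text are
hypothetical stand-ins for my real model, which has 200-300 such rules:

    // --- Item.java: the single POJO fact model (hypothetical fields) ---
    package example;

    public class Item {
        private String category;
        private boolean flagged;
        public String getCategory() { return category; }
        public void setCategory(String c) { this.category = c; }
        public boolean isFlagged() { return flagged; }
        public void setFlagged(boolean f) { this.flagged = f; }
    }

    // --- FlaggingExample.java: build the knowledge base, run the batch ---
    package example;

    import org.drools.KnowledgeBase;
    import org.drools.KnowledgeBaseFactory;
    import org.drools.builder.KnowledgeBuilder;
    import org.drools.builder.KnowledgeBuilderFactory;
    import org.drools.builder.ResourceType;
    import org.drools.io.ResourceFactory;
    import org.drools.runtime.StatefulKnowledgeSession;

    public class FlaggingExample {

        // Shape of one rule: a pure pattern match on a field, then set a
        // boolean on the matched POJO (no modify/update, so firing does
        // not retrigger evaluation of other rules).
        private static final String DRL =
            "package example\n" +
            "import example.Item\n" +
            "rule \"flag suspicious items\"\n" +
            "when\n" +
            "    $i : Item( category == \"suspicious\" )\n" +
            "then\n" +
            "    $i.setFlagged(true);\n" +
            "end\n";

        public static void main(String[] args) {
            KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
            kbuilder.add(ResourceFactory.newByteArrayResource(DRL.getBytes()),
                         ResourceType.DRL);
            if (kbuilder.hasErrors()) {
                throw new IllegalStateException(kbuilder.getErrors().toString());
            }
            KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
            kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());

            StatefulKnowledgeSession ksession = kbase.newStatefulKnowledgeSession();
            for (int i = 0; i < 20000; i++) {   // the 16k-20k fact batch
                Item item = new Item();
                item.setCategory(i % 7 == 0 ? "suspicious" : "ok");
                ksession.insert(item);
            }
            ksession.fireAllRules();
            ksession.dispose();
        }
    }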

I reviewed the benchmarks at
http://blogs.illation.com.au/category/benchmarks/ and, despite the increase
in memory consumption at different numbers of fact models, they all seem to
taper off at some point around 600 MB of RAM. Those benchmarks are written
to stress test the Rete algorithm, but it appears a worst case would leave
you running at 600 MB.

Is that a fair assumption, or do some of you look at the numbers and feel
there is some kind of f(x) to determine an approximate allocation of memory
to reserve, based on the number of inputs and the number of 'well written'
rules that just modify in-memory facts?
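
For what it's worth, lacking such an f(x), the only approach I can think of
is empirical: sample used heap before and after loading the batch and firing
the rules, then size -Xmx from that with some headroom. A rough sketch
(plain Java, nothing Drools-specific; System.gc() is only a hint, so treat
the numbers as approximate):

    // Rough empirical probe of retained working-memory size.
    public class HeapProbe {
        private static long usedHeap() {
            Runtime rt = Runtime.getRuntime();
            System.gc();   // a hint only; results are approximate
            return rt.totalMemory() - rt.freeMemory();
        }

        public static void main(String[] args) {
            long before = usedHeap();
            // ... build the knowledge base, insert the 16k-20k facts,
            //     and call fireAllRules() here, as in the sketch above ...
            long after = usedHeap();
            System.out.printf("approx. working memory: %.1f MB%n",
                              (after - before) / (1024.0 * 1024.0));
        }
    }

Re-running that under a capped heap (java -Xmx640m, say) would be a quick
way to check whether the ~600 MB ceiling from those benchmarks holds for a
specific fact count and rule set.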


