We've seen more than one case where a spreadsheet was not the right
choice: it tends to generate tons of rules where a handful of simple
rules would have done the job. It's worth investigating, but it may be
too late now...
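To illustrate the point: a decision table with one row per
state/condition combination typically compiles into hundreds of
near-identical generated rules, whereas a single parameterized rule can
often cover them all. A minimal DRL sketch (the Request and StateRate
fact types here are hypothetical, not from this thread):

```
// Hypothetical sketch: one parameterized rule replacing many
// near-identical decision-table rows. Request and StateRate are
// assumed fact types, not taken from the original application.
rule "Apply state-specific rate"
when
    $req  : Request( $state : state, $amount : amount )
    $rate : StateRate( state == $state )   // per-state value lives in data
then
    $req.setCharge( $amount * $rate.getValue() );
end
```

The per-row values move into facts (StateRate) instead of into rule
definitions, which keeps the compiled package small regardless of how
many states are added.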
-W
On 15/11/2013, Arul Prashanth <arul.prashanth(a)gmail.com> wrote:
We have a requirement to scale our Drools rule engine implementation to
all states in the United States. In the current implementation we have
around 50 decision tables, which when packaged create a 200 MB pkg file
and another pkg of around 150 MB. The request the Drools engine consumes
is a 150 KB XML file, which is unmarshalled and processed. We have a
JBoss server configured with 4 GB of memory (heap size + PermGen) that
works well processing 5 concurrent requests; any request beyond 5 leads
to an OutOfMemoryException.
Now we have to scale this application to process requests from 45
states, so there will be state-specific pkg files (350 MB of pkg files).
Each pkg is different for each state.
With all this, do we need to increase the memory as number of states *
4 GB? Is this assumption correct? Is there a better architecture for
scaling a Drools application?
Any memory/performance tuning tricks would also be helpful.
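For reference, the linear-scaling assumption asked about above works out
as follows. The figures are taken straight from the question; whether
memory really scales this way depends on how much of each state's pkg
must be loaded concurrently:

```java
// Naive sizing from the question: one 4 GB JVM's worth of memory per
// state, multiplied across 45 states. Numbers come from the thread;
// the calculation is just the stated assumption made explicit.
public class NaiveSizing {
    public static void main(String[] args) {
        int states = 45;
        int gbPerState = 4; // current JBoss heap + PermGen per deployment
        int totalGb = states * gbPerState;
        System.out.println(totalGb + " GB"); // 180 GB under the naive model
    }
}
```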
-----
- Prashanth
--
View this message in context:
http://drools.46999.n3.nabble.com/Scaling-Drools-based-application-tp4026...
Sent from the Drools: User forum mailing list archive at Nabble.com.
_______________________________________________
rules-users mailing list
rules-users(a)lists.jboss.org
https://lists.jboss.org/mailman/listinfo/rules-users