Hello,<br>On one of the Drools projects I worked on before (end of 2011), we were using:<br>1) Guvnor for rule authoring and as the rule repository. Decision tables were updated automatically from remote systems via the REST API => 100,000 rules in total<br>
2) The package was built on Guvnor => 64-bit JVM with a 6 GB heap, on a virtual environment with 2 cores and a standard server => 3-5 minutes to build the package<br>3) The runtimes loaded the package in binary format => 30 seconds to load the binary and build the knowledge base<br>
4) The project consisted of calculating costs all day long, in batches, in real time, etc. => 32-bit JVM with a 1.5 GB heap, with 3 threads running in parallel, building and removing stateful sessions.<br>We never ran into memory issues at runtime. However, running a test scenario rebuilds the package, so if more than one user is authoring rules and running test scenarios at the same time, memory consumption adds up. That is why we decided to move to a 64-bit JVM, to be able to use a bigger heap.<br>
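For reference, loading a Guvnor-built binary package into a knowledge base with the Drools 5 API looked roughly like the sketch below. The class name, package URLs, and the idea of looping over several packages are illustrative, not from our actual code; adding separately built packages one after the other into the same KnowledgeBase is done with repeated calls to the builder, as far as I remember:<br>

```java
import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;

public class BinaryPackageLoader {

    // packageUrls point at Guvnor's binary package downloads (placeholders here)
    public static KnowledgeBase load(String... packageUrls) {
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        for (String url : packageUrls) {
            // ResourceType.PKG means a pre-compiled binary package:
            // no DRL compilation happens on the runtime side
            kbuilder.add(ResourceFactory.newUrlResource(url), ResourceType.PKG);
        }
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        // Packages can be added one after the other; the network is
        // extended incrementally with each addKnowledgePackages call
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());
        return kbase;
    }
}
```

One caveat on the question below: in my experience, splitting one big package in two mainly reduces the peak memory needed to build each package in Guvnor. At runtime the combined network ends up roughly the same size, so it may not by itself fix an OutOfMemoryError that happens while the knowledge base is being constructed.<br>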
I hope it helps<br>Nicolas Héron<br><br><div class="gmail_quote">2012/2/6 vadlam [via Drools] <span dir="ltr"><<a href="/user/SendEmail.jtp?type=node&node=3719089&i=0" target="_top" rel="nofollow" link="external">[hidden email]</a>></span><br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
        I just wanted to ask one of my questions again.
<br><br>Currently, we have around 900 rule assets in Guvnor. Some of them are decision tables, which means the total number of rules may be a lot higher (each decision table has around 20 rows on average).
<br><br>Assuming the total comes to around 1,500 rules or so,
<br><br>can we
<br><br>1. split this one package in Guvnor into two packages and build them separately in Guvnor,
<br><br>2. and load them one after the other into the knowledge base?
<br><br>Would this make any difference to the way memory is consumed while loading the rules into the knowledge base?
<br><br>Currently, when loading the single package into the knowledge base, our Windows server crashes with an OutOfMemoryError.
<br><br>
        
        <br>
        <br>
        </blockquote></div><br>
        <div class="signature weak-color">Nicolas Héron</div>
<br/><hr align="left" width="300" />
View this message in context: <a href="http://drools.46999.n3.nabble.com/limits-on-number-of-rules-in-Guvnor-in-a-package-loaded-in-knowledge-base-tp3716069p3719089.html">Re: limits on number of rules in Guvnor in a package, loaded in knowledge base</a><br/>
Sent from the <a href="http://drools.46999.n3.nabble.com/Drools-User-forum-f47000.html">Drools: User forum mailing list archive</a> at Nabble.com.<br/>