[rules-users] Integration of Drools with Hadoop
Mark Proctor
mproctor at codehaus.org
Sun Oct 31 01:51:47 EDT 2010
On 31/10/2010 05:35, Roger Smith wrote:
>
> We are working on a project to integrate Drools with Apache Hadoop,
> http://hadoop.apache.org/, and run into some road blocks. We would
> very much appreciate any suggestions/help from this list.
>
> We have an app where we call the Drools rule engine inside the reducer
> task of a Hadoop map-reduce job. It throws a NullPointerException when
> we add the rule package resource to the knowledge builder. The same
> code works fine when run as part of a standalone app.
>
> Code:
>
>     private static Map<String, StatefulKnowledgeSession> sessions =
>             new HashMap<String, StatefulKnowledgeSession>();
>     private static final String RULE_PACK_DIR =
>             "file:///home/roger/Projects/gridx/";
>     private static final String RULE_PACK_EXT = ".drl";
>
>
>     public int process(String rulePackage, String dateTime, String type)
>             throws TException {
>         int rate = 0;
>         StatefulKnowledgeSession session = sessions.get(rulePackage);
>         if (null == session) {
>             KnowledgeBuilder kbuilder =
>                     KnowledgeBuilderFactory.newKnowledgeBuilder();
>             String rulePackPath = RULE_PACK_DIR + rulePackage + RULE_PACK_EXT;
>             kbuilder.add(ResourceFactory.newFileResource(rulePackPath),
>                     ResourceType.DRL);
>             if (kbuilder.hasErrors()) {
>                 System.err.println(kbuilder.getErrors().toString());
>             }
>             KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
>             kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());
>
>             session = kbase.newStatefulKnowledgeSession();
>             sessions.put(rulePackage, session);
>         }
>
>         ContractRule contractRule = new ContractRule();
>         contractRule.prepare(dateTime, type);
>         FactHandle ruleHandle = session.insert(contractRule);
>         session.fireAllRules();
>         System.out.println("" + contractRule);
>         rate = contractRule.getRate();
>         session.retract(ruleHandle);
>
>         return rate;
>     }
>
> This line throws the exception:
>
>     kbuilder.add(ResourceFactory.newFileResource(rulePackPath),
>             ResourceType.DRL);
>
ResourceFactory.newFileResource just creates a File object, after cleaning
up the path:

    public FileSystemResource(File file) {
        if ( file == null ) {
            throw new IllegalArgumentException( "File must not be null" );
        }
        this.file = new File( StringUtils.cleanPath( file.getPath() ) );
    }
The code is borrowed from Spring and uses the same cleanPath logic. Once
the File is created, it simply opens the stream:

    public InputStream getInputStream() throws IOException {
        this.lastRead = getLastModified();
        return new FileInputStream( this.file );
    }
So it's pretty basic; the stack trace with line numbers may help. But I
suspect you'll need to debug and step into the Drools code at the point
where you get the NullPointerException and see what's going on.
Mark
>
> It works fine as a standalone app, outside Hadoop.
>
> Roger Smith
>
> _______________________________________________
> rules-users mailing list
> rules-users at lists.jboss.org
> https://lists.jboss.org/mailman/listinfo/rules-users