I am running into an issue with running multiple clustered instances of JBoss. These are on separate servers (3 to be exact).
Here is what I am trying to do:
1. Route all multicast via a private network
route add -net 224.0.0.0 netmask 240.0.0.0 dev bond1
2. Make a copy of the default server configuration and name it foo.bar.com
3. Use custom start and stop scripts for each instance / site
su -l jboss -c '/apps/jboss/bin/run.sh -c foo.bar.com -b 172.25.1.52 -Djboss.partition.name=foo.bar.com --udp=224.10.10.10 > /dev/null 2>&1 &'
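For reference, the steps above pulled together into one per-instance start script might look like the following. The paths, addresses, and partition name are the ones from this post; the variable layout and the script itself are only a sketch, not something JBoss ships:

```shell
#!/bin/sh
# Hedged sketch of a per-instance start script, based on the command above.
# Each node would substitute its own bind address.
JBOSS_HOME=/apps/jboss      # install location from the post
INSTANCE=foo.bar.com        # server configuration copied from "default"
BIND_ADDR=172.25.1.52       # this node's address on the private network
MCAST_ADDR=224.10.10.10     # multicast group routed over bond1

su -l jboss -c "$JBOSS_HOME/bin/run.sh -c $INSTANCE -b $BIND_ADDR \
    -Djboss.partition.name=$INSTANCE --udp=$MCAST_ADDR \
    > /dev/null 2>&1 &"
```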
This should work: I verified that the default server starts and clustering works (I needed to set the bind_addr for multicast).
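In case the stack is still picking the wrong interface on the copied configuration, JGroups (2.x) can also be pointed at a specific address via the jgroups.bind_addr system property. A sketch using the same values as above; the property name is from JGroups, the rest is this site's setup:

```shell
# Explicitly force the JGroups stack onto the bond1 address in addition
# to JBoss's own -b flag. jgroups.bind_addr is a JGroups 2.x system
# property; 172.25.1.52 is just this node's private-network IP.
su -l jboss -c '/apps/jboss/bin/run.sh -c foo.bar.com -b 172.25.1.52 \
    -Djgroups.bind_addr=172.25.1.52 \
    -Djboss.partition.name=foo.bar.com --udp=224.10.10.10 \
    > /dev/null 2>&1 &'
```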
However, when I start the servers they do not see each other as cluster members. I have done a tcpdump and do see traffic from both nodes over bond1.
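Seeing packets in tcpdump does not prove the JVMs can actually join and receive on the group, so it may be worth testing end-to-end multicast with the small test programs that ship in JGroups. A sketch, assuming jgroups.jar lives in the instance's lib directory (the jar path and port are assumptions, the group address is the one from this post):

```shell
# On node A: join the group and listen (run this first).
java -cp /apps/jboss/server/foo.bar.com/lib/jgroups.jar \
    org.jgroups.tests.McastReceiverTest \
    -mcast_addr 224.10.10.10 -port 45566 -bind_addr 172.25.1.52

# On node B: send test packets to the same group. Lines typed here should
# show up on node A if multicast genuinely works between the hosts.
java -cp /apps/jboss/server/foo.bar.com/lib/jgroups.jar \
    org.jgroups.tests.McastSenderTest \
    -mcast_addr 224.10.10.10 -port 45566 -bind_addr 172.25.1.53
```

If the receiver never sees the sender's packets, the problem is at the network/routing layer rather than in JBoss itself.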
I am curious if there is anything special I need to do... or have I found a bug?
Any help would be greatly appreciated.
View the original post :
http://www.jboss.com/index.html?module=bb&op=viewtopic&p=4020245#...
Reply to the post :
http://www.jboss.com/index.html?module=bb&op=posting&mode=reply&a...