JBoss Community

way too many SLSB instances eventually cause heap to run out

created by Ian Springer in EJB3

In the RHQ 3.0 server, we use JBossAS 4.2.3's bundled EJB3. A user has reported far too many SLSB instances slowly building up in the heap and eventually causing the server to run out of heap space (see his post from the rhq-devel list below). Can anyone shed some light on what could be causing this? Is it a bug in the EJB container, a bug in our application code, or just a configuration setting we need to adjust?

 

Thanks,

Ian

 

-----

 

Following up on a post from about a month ago. We were seeing a persistent slow memory leak in the RHQ server in the tenured gen space that eventually led to an out-of-memory exception after running the server for about a week. I captured a heap dump and found hundreds of thousands of stateless session beans in memory. Here's a snapshot from my profiler of the classes with the greatest number of instances.

 

 

 

Name                                                               Objects  Shallow Size  Retained Size
java.util.HashMap$Entry                                            1939755      93108240      189082696
java.util.HashMap$Entry[]                                          1090957     167796768      340273520
java.util.HashMap                                                  1084265      69392960      408521632
java.util.LinkedList$Entry                                          860965      34438600      727956072
org.jboss.ejb3.BaseSessionContext                                   856281      34251240       34251240
org.rhq.enterprise.server.authz.RequiredPermissionsInterceptor      856281      13700496       13700496
org.rhq.enterprise.server.common.TransactionInterruptInterceptor    856281      13700496       13700496
org.jboss.ejb3.stateless.StatelessBeanContext                       856265      68501200      490959040
java.lang.String                                                    429025      17161000       48902064
char[]                                                              379454      37897872       37897872
java.lang.Integer                                                   171633       4119192        4119192
java.util.Hashtable$Entry                                           157623       7565904       34980432
java.util.TreeMap$Entry                                             105496       6751744       14950816
java.lang.String[]                                                   98401       4340480        6555536
org.rhq.enterprise.server.auth.SubjectManagerBean                    91116       6560352       49567104
org.rhq.enterprise.server.auth.TemporarySessionPasswordGenerator     91116       3644640       43006752
org.rhq.enterprise.server.authz.AuthorizationManagerBean             91115       2186760        2186760
org.rhq.enterprise.server.alert.AlertConditionManagerBean            91084       2914688        2914688
org.rhq.enterprise.server.alert.AlertManagerBean                     90914       9455056        9455056
org.rhq.enterprise.server.alert.AlertDefinitionManagerBean           90911       4363728        4363728
org.rhq.enterprise.server.alert.AlertConditionLogManagerBean         90903       5090568        5090568
org.rhq.enterprise.server.alert.CachedConditionManagerBean           90903       4363344        4363344
org.rhq.enterprise.server.alert.AlertDampeningManagerBean            90903       3636120        3636120
org.jboss.security.SecurityAssociation$SubjectContext                49229       2362992        2362992
org.rhq.enterprise.server.cloud.instance.ServerManagerBean           39354       3463152        3463152
org.rhq.enterprise.server.cloud.CloudManagerBean                     39354       2833488        2833488

 

    Here are the merged paths from the SubjectManagerBean to the GC root:

 

 

<All the objects>
org.jboss.ejb3.stateless.StatelessBeanContext
java.util.LinkedList$Entry
java.util.LinkedList$Entry
java.util.LinkedList
org.jboss.ejb3.InfinitePool
org.jboss.ejb3.ThreadlocalPool
org.jboss.ejb3.stateless.StatelessContainer
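That retention path can be illustrated with a small self-contained sketch. This is not the actual JBoss code; it is a simplified model (under the assumption, suggested by the merged paths above, that the ThreadlocalPool is backed by an InfinitePool whose LinkedList retains every bean context it ever created, so instances handed to short-lived threads are never reclaimed):

```java
import java.util.LinkedList;
import java.util.List;

// Simplified model of the retention path seen in the heap dump:
// a thread-local pool hands each thread its own bean instance, while
// the backing "infinite" pool keeps every instance it ever created in
// a LinkedList. Instances created by short-lived threads therefore
// stay reachable from the container forever. Illustrative only; the
// real org.jboss.ejb3 classes are more involved.
public class PoolLeakSketch {

    // Stand-in for a stateless session bean context.
    static class BeanContext {}

    // Stand-in for org.jboss.ejb3.InfinitePool: grows without bound.
    static class InfinitePool {
        final List<BeanContext> created = new LinkedList<>();
        BeanContext create() {
            BeanContext ctx = new BeanContext();
            created.add(ctx); // retained forever -> the LinkedList$Entry chain to GC root
            return ctx;
        }
    }

    // Stand-in for org.jboss.ejb3.ThreadlocalPool.
    static class ThreadlocalPool {
        final InfinitePool delegate = new InfinitePool();
        final ThreadLocal<BeanContext> local =
                ThreadLocal.withInitial(delegate::create);
        BeanContext get() { return local.get(); }
    }

    public static void main(String[] args) throws InterruptedException {
        ThreadlocalPool pool = new ThreadlocalPool();
        // Simulate many short-lived worker threads each touching the bean once.
        for (int i = 0; i < 1000; i++) {
            Thread t = new Thread(pool::get);
            t.start();
            t.join();
        }
        // Every thread left one instance behind in the backing pool,
        // even though all of those threads are now dead.
        System.out.println("instances retained: " + pool.delegate.created.size());
    }
}
```

Running the sketch prints a retained count equal to the number of threads, which mirrors the ~856k StatelessBeanContext instances hanging off the LinkedList in the dump.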

 

    All the other manager beans have similar merged paths. So I started to wonder why there were so many SLSBs in the ThreadlocalPools, and after some digging found this thread (http://community.jboss.org/message/363520) that roughly describes what I'm seeing. I still don't know why it's happening, but it gave me something to try. I changed the Stateless Bean pool class in ejb3-interceptors-aop.xml from ThreadlocalPool to StrictMaxPool. Now when I run the server and watch it with my profiler, I see at most 3 SubjectManagerBeans in memory; the same appears to be true for the other SLSBs. This isn't a solution to the problem, but I'm hoping someone can shed light on what's really going on. I would be happy to upload the heap dump somewhere public, but it's almost a GB in size.
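For reference, the change described above amounts to editing the @PoolClass annotation binding in the "Stateless Bean" domain of deploy/ejb3-interceptors-aop.xml. The fragment below is an approximation of the stock JBossAS 4.2.x file (attribute values such as maxSize and timeout are the shipped defaults as best I recall), with only the pool class swapped:

```xml
<!-- deploy/ejb3-interceptors-aop.xml (JBossAS 4.2.x) - approximate fragment -->
<domain name="Stateless Bean" extends="Intercepted Bean" inheritBindings="true">
   <!-- Default pool class was org.jboss.ejb3.ThreadlocalPool; changed to
        StrictMaxPool, which caps pooled bean instances at maxSize. -->
   <annotation expr="class(*) AND !class(@org.jboss.annotation.ejb.PoolClass)">
      @org.jboss.annotation.ejb.PoolClass (value=org.jboss.ejb3.StrictMaxPool.class, maxSize=30, timeout=10000)
   </annotation>
   <!-- remaining interceptor bindings unchanged -->
</domain>
```

With StrictMaxPool, concurrent invocations beyond maxSize block up to the timeout waiting for a pooled instance, which trades the unbounded growth seen above for a fixed ceiling.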

 

    Bala Nair

    SeaChange International

 
