[infinispan-dev] Hidden failures in the testsuite

Dan Berindei dan.berindei at gmail.com
Wed Sep 9 06:58:30 EDT 2015


Sanne, the forked JVM that actually runs the tests uses only 1GB by default:

<forkJvmArgs>-Xmx1024m -XX:MaxPermSize=256m</forkJvmArgs>
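
In case it helps, a property like that is typically wired into the
Surefire fork roughly as in this sketch (from memory; the actual plugin
configuration in the parent POM may differ):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- JVM arguments passed to the forked JVM that runs the tests -->
    <argLine>${forkJvmArgs}</argLine>
  </configuration>
</plugin>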

The CI builds use the same value, although they do enable compressed
oops explicitly and use smaller thread stacks:

env.MAVEN_FORK_OPTS = %maven_opts.memory.x64% %maven_opts.tuning%
maven_opts.memory.x64 = -XX:+UseCompressedOops -Xmx1024m -Xms256m -XX:MaxPermSize=256m -Xss512k
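
To reproduce the CI settings locally, overriding that property on the
command line should work (assuming the parent POM reads forkJvmArgs as
a regular Maven property):

mvn verify -DforkJvmArgs="-XX:+UseCompressedOops -Xmx1024m -Xms256m -XX:MaxPermSize=256m -Xss512k"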

That being said, you're probably seeing ISPN-5727:
https://github.com/infinispan/infinispan/pull/3696
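
If the OutOfMemoryError doesn't go away with that fix, you can add the
standard HotSpot flags to the fork options:

-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp

and open the resulting .hprof in Eclipse MAT or VisualVM to see what is
actually filling the heap.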

Cheers
Dan


On Wed, Sep 9, 2015 at 1:33 PM, Sanne Grinovero <sanne at infinispan.org> wrote:
> Hi all,
> sorry for the slow reply, I haven't been able to test Infinispan very often lately.
>
> On top of previously reported issues - which I still have - today I
> also noticed this one:
>
> [UnitTestTestNGListener] Test testPutTimeout(org.infinispan.client.hotrod.ClientSocketReadTimeoutTest) failed.
> Sep 09, 2015 11:28:16 AM io.netty.util.concurrent.SingleThreadEventExecutor$2 run
> WARNING: Unexpected exception from an event executor:
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>
> Unless Surefire overrides it, all my Maven jobs are assigned 2GB of
> heap. I know that's not huge, but I prefer to keep it conservative so
> that it serves as a "canary".
>
> Is 2GB known to no longer be enough, or is this worth investigating
> as a memory issue?
>
> Thanks,
> Sanne
>
>
> On 3 September 2015 at 14:41, Galder Zamarreno <galder at redhat.com> wrote:
>> Hi Sanne,
>>
>> I've looked at CDI and Compatibility issues, see below.
>>
>> Cheers,
>> --
>> Galder Zamarreño
>> Infinispan, Red Hat
>>
>> ----- Original Message -----
>>> Hey Sanne! Yep, you're right: ignoring the output is a BAD IDEA. I
>>> realized it's difficult to look through all the logs manually, so we
>>> should probably write a parser in Python or bash to grep them, and put
>>> it in the bin/ folder with the other scripts. Then at least we can run
>>> it after the testsuite finishes and analyze the results; see the sketch
>>> below. As for why broken tests "appear to be good", nobody knows. It
>>> could be a TestNG/JUnit issue, since we mix the two a lot. This needs
>>> further discussion and analysis.
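>>>
>>> A minimal sketch of what I mean (untested, and the exclusion pattern
>>> is just an example; a real script would need better filters):
>>>
>>> #!/bin/bash
>>> # Scan every module's surefire reports for exception traces that did
>>> # not fail the build, and collect them for manual review.
>>> grep -rnE "(Exception|Error):" */target/surefire-reports \
>>>     | grep -v "ExpectedException" > suspicious-output.txt
>>> wc -l suspicious-output.txt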
>>>
>>> Vitalii
>>>
>>> ----- Original Message -----
>>> From: "Sanne Grinovero" <sanne at infinispan.org>
>>> To: "infinispan -Dev List" <infinispan-dev at lists.jboss.org>
>>> Sent: Monday, 10 August 2015, 20:46:06
>>> Subject: [infinispan-dev] Hidden failures in the testsuite
>>>
>>> Hi all,
>>> I just updated my local master fork and started the testsuite, as I
>>> sometimes do.
>>>
>>> It's great to see that the build was successful, and no tests
>>> *appeared* to have failed.
>>>
>>> But! Lazily scrolling up in the console, I see lots of exceptions
>>> that don't look intentional (I'm aware that some tests create error
>>> conditions on purpose). Also, some tests are extremely verbose, which
>>> might be why nobody noticed these.
>>>
>>> Some examples:
>>>  - org.infinispan.it.compatibility.EmbeddedRestHotRodTest seems to log
>>> TRACE to the console (and probably the whole module does)
>>
>> ^ I've run the compatibility testsuite manually and didn't see any such issue on master:
>> https://gist.github.com/galderz/b59f1ed4599229022f27
>>
>> Are you still having issues with this?
>>
>>>  - CDI tests such as org.infinispan.cdi.InfinispanExtensionRemote seem
>>> to fail in great number because of some ClassNotFoundException(s)
>>> and/or ResourceLoadingException(s)
>>
>> ^ Hmmmm, not seeing any of that either:
>> https://gist.github.com/galderz/1143078e6be8869cd602
>>
>> Are you still having issues with this?
>>
>>>  - OSGi integration tests all seem to be broken by a faulty
>>> integration with Aries / Geronimo
>>>  - OSGi integration tests dump a lot of unnecessary information to the
>>> build console
>>>  - the Infinispan Query tests also log lots of WARN messages, around
>>> missing configuration properties and in some cases concerning
>>> exceptions; I'm pretty sure I had resolved those in the past, so it
>>> seems some refactoring was done without considering the log output.
>>>
>>> Please don't ignore the output; if it's too verbose to watch, that
>>> needs to be resolved too.
>>>
>>> I also monitor the "expected execution time" of the modules I'm
>>> interested in; that's been useful in some cases to spot a regression.
>>>
>>> One big question: why is it that so many tests "appear to be good" but
>>> are actually broken? I would like to understand that.
>>>
>>> Thanks,
>>> Sanne