Too many open files error with Netty

Trustin Lee (이희승) trustin at gmail.com
Tue Oct 27 07:40:43 EDT 2009


On Tue, Oct 27, 2009 at 7:42 PM, Trustin Lee (이희승) <trustin at gmail.com> wrote:
> On Tue, Oct 27, 2009 at 7:37 PM, Albert Strasheim <fullung at gmail.com> wrote:
>> Hello,
>>
>> 2009/10/27 Trustin Lee (이희승) <trustin at gmail.com>:
>>> Sounds like you are not getting the 'too many open files' error
>>> anymore.  Could you confirm it?
>>
>> That is correct.
>
> Thanks for the confirmation. :)
>
>>> Do you get OutOfMemoryError in the end, or is it just working fine?  I
>>> do see saw-toothed patterns in the memory graph, but I haven't seen an
>>> exception so far.  If the connection rate is high, GC will be busy,
>>> but I'm not sure it will slow down the application seriously.
>>> Additional information is appreciated.
>>
>> When the maximum number of 10000 connections is active, about a
>> hundred sessions are completing per second. But when it reaches about
>> 25000 connections in total, I don't get OutOfMemoryError, but the
>> application basically stops and just spends all its time GCing.
>
> That is strange.  What happens if you increase the heap size to 512M?
> Would you mind sending me the heap dump file?  I will take a look
> using my profiler.

I ran it myself with a large 'maxClients' value and succeeded in
reproducing the huge memory consumption.  According to the profiler, it
is due to the pending write requests in the queue.  A simple calculation
(30 * 20 * 32 * 10000) shows that a lot of heap is required to handle
this amount of simultaneous I/O.  Perhaps you should reduce the
traffic?
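The back-of-the-envelope figure above can be checked directly. As a minimal sketch, assuming the four factors multiply out to bytes of queued write data (the message does not spell out what each factor means), the product comes to roughly 183 MiB, enough to dominate a small heap on its own:

```java
public class PendingWriteMath {
    public static void main(String[] args) {
        // Factors taken from the message above; their individual meanings
        // (messages, bytes per message, overhead, connections) are an
        // assumption -- only the product matters for the heap estimate.
        long pendingBytes = 30L * 20L * 32L * 10000L;
        System.out.println(pendingBytes);                 // 192000000
        System.out.println(pendingBytes / (1024 * 1024)); // 183 (MiB)
    }
}
```

In Netty 3 this kind of buildup is usually bounded at the source rather than by enlarging the heap, e.g. by pausing writes while `Channel.isWritable()` returns false, or by tuning `writeBufferHighWaterMark` on the channel config.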

-- Trustin Lee, http://gleamynode.net/



More information about the netty-users mailing list