An existing connection was forcibly closed by the remote host
Luis Neves
luis.neves at gmail.com
Sun Jul 5 09:17:13 EDT 2009
neilson9 wrote:
> Hi,
> I've been testing Netty (CR2 nightly snapshot) for the last few weeks after
> integrating it into our environment. At a stage where we experience a burst
> of traffic - 70-100 machines all establishing connections to stream results -
> many of them end up with 'existing connection was forcibly closed by the
> remote host' (as below). It's only during this short burst that the
> exception occurs. We also have retry logic to throw away the existing
> connection and establish a new one - they do eventually get through, but
> also hit subsequent exceptions.
>
> The environment is a series of Windows machines - the main server nodes are
> Windows Server 2003 (I've upped the Tcpip.sys connection limit to 2000) -
> and while monitoring the network we peak at 1095 connections.
>
> I would have thought this number of connections wouldn't be a problem - is
> this a Windows OS problem, or something I can work around or tune in Netty?
Hi! Misery loves company. I'm facing the exact same issue while testing
the Netty HTTP server on Windows 2003.
I get a bunch of
"java.io.IOException: An established connection was aborted by the
software in your host machine"
and
"java.io.IOException: An existing connection was forcibly closed by the
remote host"
It works flawlessly (with amazing performance) on Linux, but on Windows
the above errors start to pop up as the number of connected clients
increases.
What helps somewhat is to increase the socket backlog:

// "backlog" maps to the queue length passed to the underlying
// server socket's bind(), i.e. how many pending accepts may queue up.
ChannelFactory factory = new NioServerSocketChannelFactory(
        Executors.newCachedThreadPool(),   // boss threads
        Executors.newCachedThreadPool());  // worker threads
ServerBootstrap bootstrap = new ServerBootstrap(factory);
bootstrap.setOption("backlog", 1024);
<aside>
Getting and setting socket options in Netty is not straightforward; it's
not obvious from the API what can be changed - you must know beforehand
the names of the properties you want to get/set.
</aside>
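
For example (a sketch of what I mean; the string keys are resolved by
reflection against the channel config, and the "child." prefix applies an
option to each accepted connection rather than to the server socket itself):

// options on the listening server socket
bootstrap.setOption("backlog", 1024);
bootstrap.setOption("reuseAddress", true);
// "child." prefix: applied to every accepted connection
bootstrap.setOption("child.tcpNoDelay", true);
bootstrap.setOption("child.keepAlive", true);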
This helps but doesn't solve the problem; it only slightly raises the bar
for the number of connected clients... I read somewhere that the maximum
value for the socket backlog on Windows is 200.
It may very well be an issue with the JVM network stack on Windows, but
Grizzly and MINA don't appear to suffer from it (at least in my initial
testing).
Like you, I'm also tuning the Windows TCP parameters, but with no luck so
far.
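
For reference, the TCP parameters in question live under the Tcpip service
key in the registry; the .reg below is only a sketch with illustrative
values, not a recommendation:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
; ephemeral port ceiling: 0xfffe = 65534 (default is 5000)
"MaxUserPort"=dword:0000fffe
; how long closed sockets linger in TIME_WAIT: 0x1e = 30s (default 240s)
"TcpTimedWaitDelay"=dword:0000001e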
--
/**
* Luis Neves
* @e-mail: luis.neves at co.sapo.pt
* @xmpp: lfs_neves at sapo.pt
* @web: <http://technotes.blogs.sapo.pt/>
* @mobile: +351 962 057 656
*/