FrameDecoder

sohguanh sohguanh at yahoo.com.sg
Sun Sep 27 22:38:47 EDT 2009


Not wanting to flood the forum, I am tagging this onto my existing topic.

I am using the Telnet Client and Server example.

I have noticed a port usage pattern. For a very simple telnet server at startup,
netstat shows two ports in the ESTABLISHED state. However, when I start the
telnet client, netstat shows at least 5-6 ports in use, all in the ESTABLISHED
state. Once the server adds a ReadTimeoutHandler, or the client closes its
connection, those 5-6 ESTABLISHED ports change to TIME_WAIT almost immediately,
which is good because those ports can then be reused by others.
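For reference, here is roughly how I add the ReadTimeoutHandler on the server
side, assuming Netty 3.x and the stock telnet example pipeline; the 30-second
timeout, the handler names, and the reuse of the example's TelnetServerHandler
are just my own choices, not something the example prescribes:

import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.example.telnet.TelnetServerHandler; // adjust to where the example handler lives in your build
import org.jboss.netty.handler.codec.frame.DelimiterBasedFrameDecoder;
import org.jboss.netty.handler.codec.frame.Delimiters;
import org.jboss.netty.handler.codec.string.StringDecoder;
import org.jboss.netty.handler.codec.string.StringEncoder;
import org.jboss.netty.handler.timeout.ReadTimeoutHandler;
import org.jboss.netty.util.HashedWheelTimer;
import org.jboss.netty.util.Timer;

public class TimeoutTelnetServerPipelineFactory implements ChannelPipelineFactory {

    // One timer shared by all connections; it drives the read-timeout checks.
    private final Timer timer = new HashedWheelTimer();

    public ChannelPipeline getPipeline() throws Exception {
        ChannelPipeline pipeline = Channels.pipeline();

        // If nothing is read from a connection for 30 seconds,
        // ReadTimeoutHandler fires a ReadTimeoutException upstream; the telnet
        // example's handler closes the channel in exceptionCaught, so the
        // socket leaves the ESTABLISHED state in netstat.
        pipeline.addLast("timeout", new ReadTimeoutHandler(timer, 30));

        // Same framing/decoding chain as the stock telnet example.
        pipeline.addLast("framer",
                new DelimiterBasedFrameDecoder(8192, Delimiters.lineDelimiter()));
        pipeline.addLast("decoder", new StringDecoder());
        pipeline.addLast("encoder", new StringEncoder());

        // Business-logic handler from the telnet example.
        pipeline.addLast("handler", new TelnetServerHandler());

        return pipeline;
    }
}

Closing from either side has the same visible effect in netstat: the
ESTABLISHED entries move on to TIME_WAIT once the connection is torn down.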

Imagine a lot of clients connecting to the server: at any one time a lot of
ports will be in use to service them, at least until they time out or close
their connections. Is this by Netty's design? Why does one client connection to
the server require so many open ports, when the server itself only uses two
ports at startup?


-- 
View this message in context: http://n2.nabble.com/FrameDecoder-tp3710280p3725805.html
Sent from the Netty User Group mailing list archive at Nabble.com.

