FrameDecoder

Trustin Lee (이희승) trustin at gmail.com
Tue Sep 29 04:57:16 EDT 2009


Well, perhaps you are running a client and a server on the same
machine?  Actually, I don't see any unnecessarily open ports.  If in
doubt, please let me know exactly how to reproduce the problem.

— Trustin Lee, http://gleamynode.net/

On Mon, Sep 28, 2009 at 11:38 AM, sohguanh <sohguanh at yahoo.com.sg> wrote:
>
> Not wanting to flood the forum, I am tagging this onto my existing topic.
>
> I am using the Telnet Client and Server example.
>
> I have noticed a port usage pattern. When a simple telnet server starts
> up, netstat shows 2 ports in the ESTABLISHED state. However, when I start
> the telnet client, netstat shows at least 5-6 ports in use, all in the
> ESTABLISHED state. Once the server's ReadTimeoutHandler kicks in or the
> client closes the connection, those 5-6 ESTABLISHED ports change to
> TIME_WAIT almost immediately, which is good because they can then be
> reused by others.
>
> Imagine many clients connecting to the server: at any one time many
> ports will be in use to service them, at least until they time out or
> close the connection. Is this by design in Netty? Why does a client
> connection to a server require so many open ports, when server startup
> only uses 2 ports?
>
>
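
For reference, the following is a minimal sketch of the kind of server
pipeline discussed above, written against the Netty 3.x API of the time.
The class name, the port number, the 30-second timeout, and the anonymous
handler are illustrative assumptions rather than the actual Telnet example
code. ReadTimeoutHandler raises a ReadTimeoutException when no data is read
within the timeout, and the handler below closes the channel in response,
after which the corresponding netstat entries leave the ESTABLISHED state
as described in the question.

    import java.net.InetSocketAddress;
    import java.util.concurrent.Executors;

    import org.jboss.netty.bootstrap.ServerBootstrap;
    import org.jboss.netty.channel.ChannelHandlerContext;
    import org.jboss.netty.channel.ChannelPipeline;
    import org.jboss.netty.channel.ChannelPipelineFactory;
    import org.jboss.netty.channel.Channels;
    import org.jboss.netty.channel.ExceptionEvent;
    import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
    import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
    import org.jboss.netty.handler.timeout.ReadTimeoutHandler;
    import org.jboss.netty.util.HashedWheelTimer;
    import org.jboss.netty.util.Timer;

    public class TimeoutTelnetServer {

        public static void main(String[] args) {
            // One timer shared by all ReadTimeoutHandler instances.
            final Timer timer = new HashedWheelTimer();

            ServerBootstrap bootstrap = new ServerBootstrap(
                    new NioServerSocketChannelFactory(
                            Executors.newCachedThreadPool(),
                            Executors.newCachedThreadPool()));

            bootstrap.setPipelineFactory(new ChannelPipelineFactory() {
                public ChannelPipeline getPipeline() {
                    ChannelPipeline pipeline = Channels.pipeline();

                    // Raises a ReadTimeoutException if the peer sends nothing
                    // for 30 seconds (timeout chosen arbitrarily here).
                    pipeline.addLast("readTimeout",
                            new ReadTimeoutHandler(timer, 30));

                    // Stand-in for the Telnet example's own handler: close the
                    // channel when the read timeout (or any other error)
                    // occurs, so an idle connection does not stay ESTABLISHED.
                    pipeline.addLast("handler", new SimpleChannelUpstreamHandler() {
                        @Override
                        public void exceptionCaught(ChannelHandlerContext ctx,
                                                    ExceptionEvent e) {
                            e.getChannel().close();
                        }
                    });
                    return pipeline;
                }
            });

            // Port 8023 is just an example value.
            bootstrap.bind(new InetSocketAddress(8023));
        }
    }

Running a server like this alongside the example Telnet client while
watching netstat should reproduce the ESTABLISHED-to-TIME_WAIT transition
described above.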


