Concurrency within ChannelHandlers?

Iain McGinniss iainmcgin at gmail.com
Mon Jul 20 09:43:10 EDT 2009


Hello all,

I am currently trying to put together my own HTTP tunnel, and so far
I've been writing the client side. What I want to ensure is that the
server end of the tunnel always has at least one outstanding request
that it can use to stream responses back to me (with a maximum 16KB
payload per request / response). Since sending happens asynchronously
from receiving, I don't know what the best approach is in Netty when
it comes to concurrency in the handlers. The send handler may, for
instance, choose to halt transmission of the current request so that
it completes and becomes available for the server to send messages
back on sooner - this can be achieved by sending fewer chunks for the
current request, so that more requests are generated (higher overhead,
but it ensures the server is never left waiting for a request it can
use to send its next response).
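To make that concrete, here is a rough sketch of the coordination I
have in mind. The class and method names are hypothetical, and it is
written against the Netty 3 HTTP codec as I understand it: the receive
path flips a flag, and whatever code is streaming the request body
checks that flag before writing each chunk.

import java.util.concurrent.atomic.AtomicBoolean;

import org.jboss.netty.buffer.ChannelBuffer;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.handler.codec.http.DefaultHttpChunk;
import org.jboss.netty.handler.codec.http.HttpChunk;

// Hypothetical client-side handler: the receive path asks the send
// path to finish the in-flight chunked request early, so the server
// gets a fresh request (and a fresh response slot) sooner.
public class TunnelSendHandler extends SimpleChannelUpstreamHandler {

    // Set by the receive path when the current request should be cut short.
    private final AtomicBoolean cutRequestShort = new AtomicBoolean(false);

    @Override
    public void messageReceived(ChannelHandlerContext ctx, MessageEvent e)
            throws Exception {
        // Part of a response arrived; signal the sender to wrap up the
        // current request so another one can be issued.
        cutRequestShort.set(true);
        super.messageReceived(ctx, e);
    }

    // Called by the code streaming the request body. Returns true if the
    // chunk went out on the current request, false if the request was
    // terminated and the caller should start a new one for the rest.
    public boolean writeChunk(Channel channel, ChannelBuffer payload) {
        if (cutRequestShort.compareAndSet(true, false)) {
            // Fewer chunks for this request: terminate it now (higher
            // overhead, but the server is not kept waiting).
            channel.write(HttpChunk.LAST_CHUNK);
            return false;
        }
        channel.write(new DefaultHttpChunk(payload));
        return true;
    }
}

When writeChunk() returns false, the caller would open a new request
and carry on sending the remaining payload on that.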

So in this kind of situation, where the send behaviour depends on the
timing of received messages, how should I orchestrate things? Is
there a way to get a handle on the thread pool used by the channel
and schedule tasks for later execution? Or is there some more elegant
way of doing this in Netty? Ideally, I'd like to avoid creating my
own pools, so that the number of threads in use doesn't grow with the
number of HTTP tunnels. (A sketch of the sort of thing I mean is
below.)
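For illustration, what I am imagining is a single scheduler shared by
every tunnel rather than one pool per tunnel. The sketch below assumes
Netty's HashedWheelTimer works the way I think it does (one worker
thread no matter how many timeouts are scheduled on it, if I've read
the 3.1 API right); TunnelScheduler is just a name I made up.

import java.util.concurrent.TimeUnit;

import org.jboss.netty.util.HashedWheelTimer;
import org.jboss.netty.util.Timeout;
import org.jboss.netty.util.Timer;
import org.jboss.netty.util.TimerTask;

// Sketch: one shared timer for all tunnels, so the number of timer
// threads stays constant regardless of how many tunnels are open.
public final class TunnelScheduler {

    // A single HashedWheelTimer uses one worker thread however many
    // timeouts are scheduled on it.
    private static final Timer TIMER = new HashedWheelTimer();

    private TunnelScheduler() {
    }

    // Schedule a task to run after the given delay.
    public static Timeout schedule(final Runnable task, long delay,
            TimeUnit unit) {
        return TIMER.newTimeout(new TimerTask() {
            public void run(Timeout timeout) {
                task.run();
            }
        }, delay, unit);
    }
}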

Iain

