http response not being sent
Chavdar Botev
cbotev at gmail.com
Fri Nov 5 13:27:49 EDT 2010
Hi!
I am using Netty 3.2.2.Final on Mac OS X 10.6.4 (10F569).
I am observing a strange issue where an HTTP response is written by the
server but never sent over the wire.
I have set up a LoggingHandler as the first handler in the pipeline on
both the server and the client:
// server
pipeline.addLast("netty server traffic",
                 new LoggingHandler("netty server traffic",
                                    InternalLogLevel.DEBUG, true));

// client
pipeline.addLast("netty client traffic",
                 new LoggingHandler("netty client traffic",
                                    InternalLogLevel.DEBUG, true));
I see the request being sent:
//client
97985 [pull] DEBUG netty client traffic - [id: 0x32771eb1,
/127.0.0.1:51059 => localhost/127.0.0.1:8080] WRITE:
DynamicChannelBuffer(ridx=0, widx=209, cap=256) - (HEXDUMP:
474554202f73747265616d3f736f75726365733d3226636865636b506f696e743d7b2277696e646f774f6666736574223a302c2277696e646f7753636e223a302c22636f6e73756d7074696f6e5f6d6f6465223a224f4e4c494e455f434f4e53554d5054494f4e227d266f75747075743d62696e6172792673697a653d3135303030303020485454502f312e310d0a486f73743a206c6f63616c686f73740d0a436f6e6e656374696f6e3a206b6565702d616c6976650d0a4163636570742d456e636f64696e673a20677a69700d0a0d0a)
[2010-11-05 09:55:41,610] DEBUG [id: 0x32771eb1, /127.0.0.1:51059 =>
localhost/127.0.0.1:8080] WRITTEN_AMOUNT: 209 (netty client traffic)
97985 [New I/O client worker #4-1] DEBUG netty client traffic - [id:
0x32771eb1, /127.0.0.1:51059 => localhost/127.0.0.1:8080]
WRITTEN_AMOUNT: 209
//tcpdump
09:55:41.610514 IP localhost.51059 > localhost.http-alt: Flags [P.],
seq 1634156:1634365, ack 1210854, win 65535, options [nop,nop,TS val
35794079 ecr 35794079], length 209
0x0000: 4500 0105 0879 4000 4006 0000 7f00 0001 E....y@.@.......
0x0010: 7f00 0001 c773 1f90 7c16 5004 1aa3 ef4a .....s..|.P....J
0x0020: 8018 ffff fef9 0000 0101 080a 0222 2c9f .............",.
0x0030: 0222 2c9f 4745 5420 2f73 7472 6561 6d3f .",.GET./stream?
0x0040: 736f 7572 6365 733d 3226 6368 6563 6b50 sources=2&checkP
0x0050: 6f69 6e74 3d7b 2277 696e 646f 774f 6666 oint={"windowOff
0x0060: 7365 7422 3a30 2c22 7769 6e64 6f77 5363 set":0,"windowSc
0x0070: 6e22 3a30 2c22 636f 6e73 756d 7074 696f n":0,"consumptio
0x0080: 6e5f 6d6f 6465 223a 224f 4e4c 494e 455f n_mode":"ONLINE_
0x0090: 434f 4e53 554d 5054 494f 4e22 7d26 6f75 CONSUMPTION"}&ou
0x00a0: 7470 7574 3d62 696e 6172 7926 7369 7a65 tput=binary&size
0x00b0: 3d31 3530 3030 3030 2048 5454 502f 312e =1500000.HTTP/1.
0x00c0: 310d 0a48 6f73 743a 206c 6f63 616c 686f 1..Host:.localho
0x00d0: 7374 0d0a 436f 6e6e 6563 7469 6f6e 3a20 st..Connection:.
0x00e0: 6b65 6570 2d61 6c69 7665 0d0a 4163 6365
keep-alive..Acce 0x00f0: 7074 2d45 6e63 6f64 696e 673a 2067
7a69 pt-Encoding:.gzi
0x0100: 700d 0a0d 0a p....
//server
[New I/O server worker #1-4] DEBUG netty server traffic - [id:
0x1966c114, /127.0.0.1:51059 => /127.0.0.1:8080] RECEIVED:
BigEndianHeapChannelBuffer(ridx=0, widx=209, cap=209) - (HEXDUMP:
474554202f73747265616d3f736f75726365733d3226636865636b506f696e743d7b2277696e646f774f6666736574223a302c2277696e646f7753636e223a302c22636f6e73756d7074696f6e5f6d6f6465223a224f4e4c494e455f434f4e53554d5054494f4e227d266f75747075743d62696e6172792673697a653d3135303030303020485454502f312e310d0a486f73743a206c6f63616c686f73740d0a436f6e6e656374696f6e3a206b6565702d616c6976650d0a4163636570742d456e636f64696e673a20677a69700d0a0d0a)
[New I/O server worker #1-4] DEBUG netty server traffic - [id:
0x1966c114, /127.0.0.1:51059 => /127.0.0.1:8080] CHANGE_INTEREST: 0
[New I/O server worker #1-4] DEBUG netty server traffic - [id:
0x1966c114, /127.0.0.1:51059 => /127.0.0.1:8080] INTEREST_CHANGED
[pool-1-thread-40] DEBUG netty server traffic - [id: 0x1966c114,
/127.0.0.1:51059 => /127.0.0.1:8080] CHANGE_INTEREST: 1
[pool-1-thread-40] DEBUG netty server traffic - [id: 0x1966c114,
/127.0.0.1:51059 => /127.0.0.1:8080] INTEREST_CHANGED
I see the response being generated by the server and passing through
the LoggingHandler:
[pool-1-thread-40] DEBUG netty server traffic - [id: 0x1966c114,
/127.0.0.1:51059 => /127.0.0.1:8080] WRITE:
DynamicChannelBuffer(ridx=0, widx=155, cap=256) - (HEXDUMP:
485454502f312e3120323030204f4b0d0a436f6e74656e742d547970653a20746578742f706c61696e3b20636861727365743d5554462d380d0a4163636573732d436f6e74726f6c2d416c6c6f772d4f726967696e3a202a0d0a782d646275732d7265712d69643a20373832330d0a782d646275732d7265712d6c6174656e63793a20300d0a436f6e74656e742d4c656e6774683a20300d0a0d0a)
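Decoding that HEXDUMP back to ASCII confirms it (plain JDK, no Netty needed; DecodeHexDump is just an illustrative name): the buffer is a complete, well-formed response head with an empty body.

```java
// Decode a Netty LoggingHandler HEXDUMP string back into ASCII so the
// logged bytes can be inspected as text.
public class DecodeHexDump {
    static String decode(String hex) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < hex.length(); i += 2) {
            out.append((char) Integer.parseInt(hex.substring(i, i + 2), 16));
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // The 155-byte payload from the WRITE log entry above:
        String hex = "485454502f312e3120323030204f4b0d0a"
                + "436f6e74656e742d547970653a20746578742f706c61696e3b20636861727365743d5554462d380d0a"
                + "4163636573732d436f6e74726f6c2d416c6c6f772d4f726967696e3a202a0d0a"
                + "782d646275732d7265712d69643a20373832330d0a"
                + "782d646275732d7265712d6c6174656e63793a20300d0a"
                + "436f6e74656e742d4c656e6774683a20300d0a0d0a";
        // Decodes to, terminated by the blank line that ends the head:
        //   HTTP/1.1 200 OK
        //   Content-Type: text/plain; charset=UTF-8
        //   Access-Control-Allow-Origin: *
        //   x-dbus-req-id: 7823
        //   x-dbus-req-latency: 0
        //   Content-Length: 0
        System.out.print(decode(hex));
    }
}
```

So the response itself looks fine at the point where the LoggingHandler sees it; the bytes simply never reach the socket.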
But tcpdump shows only the ACK for the request and no response:
09:55:41.610636 IP localhost.http-alt > localhost.51059: Flags [.],
ack 1634365, win 65535, options [nop,nop,TS val 35794079 ecr
35794079], length 0
0x0000: 4500 0034 6a27 4000 4006 0000 7f00 0001 E..4j'@.@.......
0x0010: 7f00 0001 1f90 c773 1aa3 ef4a 7c16 50d5 .......s...J|.P.
0x0020: 8010 ffff fe28 0000 0101 080a 0222 2c9f .....(.......",.
0x0030: 0222 2c9f .",.
Here are also thread dumps from both sides:
//server
Thread [pool-1-thread-40] (Suspended)
Object.wait(long) line: not available [native method]
DefaultChannelFuture(Object).wait() line: 485 [local variables unavailable]
DefaultChannelFuture.await() line: 186
ChunkedBodyWritableByteChannel.awaitChannelFuture(ChannelFuture) line: 167
ChunkedBodyWritableByteChannel.writeToChannel(Object) line: 153
ChunkedBodyWritableByteChannel.close() line: 83
HttpRequestHandler.writeResponse(MessageEvent) line: 264
HttpRequestHandler.messageReceived(ChannelHandlerContext, MessageEvent) line: 136
HttpRequestHandler(SimpleChannelUpstreamHandler).handleUpstream(ChannelHandlerContext, ChannelEvent) line: 80
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 545
DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 754
ChannelEventRunnable.run() line: 69
OrderedMemoryAwareThreadPoolExecutor$ChildExecutor.run() line: 316
ThreadPoolExecutor$Worker.runTask(Runnable) line: 886
ThreadPoolExecutor$Worker.run() line: 908
Thread.run() line: 637
Thread [New I/O server worker #1-4] (Suspended)
Unsafe.park(boolean, long) line: not available [native method]
LockSupport.park(Object) line: 158
Semaphore$NonfairSync(AbstractQueuedSynchronizer).parkAndCheckInterrupt() line: 747
Semaphore$NonfairSync(AbstractQueuedSynchronizer).doAcquireShared(int) line: 877
Semaphore$NonfairSync(AbstractQueuedSynchronizer).acquireShared(int) line: 1197
Semaphore.acquireUninterruptibly() line: 309
OrderedMemoryAwareThreadPoolExecutor(MemoryAwareThreadPoolExecutor).execute(Runnable) line: 349
ExecutionHandler.handleUpstream(ChannelHandlerContext, ChannelEvent) line: 146
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 545
DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 754
Channels.fireMessageReceived(ChannelHandlerContext, Object, SocketAddress) line: 302
HttpRequestDecoder(ReplayingDecoder<T>).unfoldAndfireMessageReceived(ChannelHandlerContext, Object, SocketAddress) line: 523
HttpRequestDecoder(ReplayingDecoder<T>).callDecode(ChannelHandlerContext, Channel, ChannelBuffer, SocketAddress) line: 507
HttpRequestDecoder(ReplayingDecoder<T>).messageReceived(ChannelHandlerContext, MessageEvent) line: 444
HttpRequestDecoder(SimpleChannelUpstreamHandler).handleUpstream(ChannelHandlerContext, ChannelEvent) line: 80
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 545
DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 754
LoggingHandler.handleUpstream(ChannelHandlerContext, ChannelEvent) line: 231
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 545
DefaultChannelPipeline.sendUpstream(ChannelEvent) line: 540
Channels.fireMessageReceived(Channel, Object, SocketAddress) line: 274
Channels.fireMessageReceived(Channel, Object) line: 261
NioWorker.read(SelectionKey) line: 350
NioWorker.processSelectedKeys(Set<SelectionKey>) line: 281
NioWorker.run() line: 201
ThreadRenamingRunnable.run() line: 108
IoWorkerRunnable.run() line: 46
ThreadPoolExecutor$Worker.runTask(Runnable) line: 886
ThreadPoolExecutor$Worker.run() line: 908
Thread.run() line: 637
Thread [New I/O server boss #1 ([id: 0x1f315415, /0.0.0.0:8080])] (Suspended)
KQueueArrayWrapper.kevent0(int, long, int, long) line: not available [native method]
KQueueArrayWrapper.poll(long) line: 136
KQueueSelectorImpl.doSelect(long) line: 69
KQueueSelectorImpl(SelectorImpl).lockAndDoSelect(long) line: 69
KQueueSelectorImpl(SelectorImpl).select(long) line: 80
NioServerSocketPipelineSink$Boss.run() line: 241
ThreadRenamingRunnable.run() line: 108
IoWorkerRunnable.run() line: 46
ThreadPoolExecutor$Worker.runTask(Runnable) line: 886
ThreadPoolExecutor$Worker.run() line: 908
Thread.run() line: 637
//client
Thread [New I/O client worker #4-1] (Suspended)
KQueueArrayWrapper.kevent0(int, long, int, long) line: not available [native method]
KQueueArrayWrapper.poll(long) line: 136
KQueueSelectorImpl.doSelect(long) line: 69
KQueueSelectorImpl(SelectorImpl).lockAndDoSelect(long) line: 69
KQueueSelectorImpl(SelectorImpl).select(long) line: 80
SelectorUtil.select(Selector) line: 38
NioWorker.run() line: 164
ThreadRenamingRunnable.run() line: 108
IoWorkerRunnable.run() line: 46
ThreadPoolExecutor$Worker.runTask(Runnable) line: 886
ThreadPoolExecutor$Worker.run() line: 908
Thread.run() line: 637
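Reading the two server stacks together: pool-1-thread-40 is parked in DefaultChannelFuture.await(), while New I/O server worker #1-4 is parked in Semaphore.acquireUninterruptibly() inside MemoryAwareThreadPoolExecutor.execute(). As a purely illustrative sketch (plain JDK, not Netty code; BlockedHandoff, memoryLimit, and writeDone are made-up names), this is the general shape of a mutual wait between a bounded executor and a task that blocks on a signal the submitting thread would deliver:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Illustrative only: a pooled task blocks awaiting a signal (compare
// DefaultChannelFuture.await() in the dump), while the thread that could
// deliver that signal is stuck acquiring the executor's limit semaphore
// (compare MemoryAwareThreadPoolExecutor.execute()).
public class BlockedHandoff {
    static boolean tryHandoff() throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        Semaphore memoryLimit = new Semaphore(1); // stands in for the executor's memory budget
        CountDownLatch writeDone = new CountDownLatch(1);

        memoryLimit.acquire(); // budget already consumed by the in-flight request
        pool.submit(() -> {
            try {
                writeDone.await(); // like pool-1-thread-40 waiting on the future
            } catch (InterruptedException ignored) {
            }
            memoryLimit.release();
        });

        // Like the I/O worker trying to hand off the next event to the executor:
        boolean acquired = memoryLimit.tryAcquire(200, TimeUnit.MILLISECONDS);

        writeDone.countDown(); // break the cycle so this demo terminates
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return acquired;
    }

    public static void main(String[] args) throws InterruptedException {
        // prints: handoff succeeded: false
        System.out.println("handoff succeeded: " + tryHandoff());
    }
}
```

Neither thread can make progress until one of the two waits is released from outside, which matches the symptom of a written-but-never-sent response.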
Any ideas about what might be happening and/or how I can debug it?
Cheers,
Chavdar