goodnews...@gmail.com
May 6, 2018, 1:01:33 PM
to Netty discussions
Ok, I am debugging this further.
It seems like my client sends a data packet to the server.
The server reads it properly.
But when my server sends data back to the client, the client reads back the bytes it previously sent to the server instead of the server's reply.
It reads back only as many bytes as the server sent.
So I am sending "123000" from the client.
When the server sends any packet back to the client, say "A",
the client reads in "1" instead of "A".
If the server sends "ABCD",
the client reads in "1230".
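For context, my client handler is essentially this shape (simplified, the real class and payload are different):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.CharsetUtil;

public class EchoTestClientHandler extends ChannelInboundHandlerAdapter {

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // Send the test payload once the connection is up.
        ctx.writeAndFlush(Unpooled.copiedBuffer("123000", CharsetUtil.UTF_8));
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf in = (ByteBuf) msg;
        try {
            // I expect the server's reply ("A") here, but I see "1" instead.
            System.out.println("client read: " + in.toString(CharsetUtil.UTF_8));
        } finally {
            in.release();
        }
    }
}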
Is there something I should know about ChannelHandlerContext?
Is it possible for ChannelHandlerContext.write(ByteBuf) to be corrupted by the input somehow?
I do not write to the ChannelHandlerContext anywhere else.
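The server-side write I am asking about is essentially this (again simplified, the real reply is built differently):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.CharsetUtil;

public class EchoTestServerHandler extends ChannelInboundHandlerAdapter {

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf in = (ByteBuf) msg;
        try {
            System.out.println("server read: " + in.toString(CharsetUtil.UTF_8));
        } finally {
            in.release();
        }
        // The only place I write to this context: a fresh buffer for the reply.
        ctx.writeAndFlush(Unpooled.copiedBuffer("A", CharsetUtil.UTF_8));
    }
}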
The whole system seems to run fine except when I send from this one particular place.
If I send the same packet twice, the echo shows up just once and then sending resumes normally.
So I could just give up, call it an anomaly, send the packet twice, and my program would work.
But I want to track this down while I can so it doesn't pop up elsewhere.
What could be causing an echo? An echo that only happens when I .write different data out, and only to the length of that data?
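The only pattern I can think of that reproduces this symptom exactly is accidentally pulling the reply out of the same buffer that held the outbound packet, sized to however many bytes actually arrived. A made-up sketch of that kind of bug (none of these names are from my code):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.util.CharsetUtil;

public class BufferReuseBug {
    public static void main(String[] args) {
        // Buffer that held the outbound packet; its readerIndex is still 0.
        ByteBuf outbound = Unpooled.copiedBuffer("123000", CharsetUtil.UTF_8);

        // The server's actual reply arrives in its own buffer.
        ByteBuf reply = Unpooled.copiedBuffer("A", CharsetUtil.UTF_8);

        // BUG: sizing the read from the reply, but pulling the bytes from the outbound buffer.
        byte[] data = new byte[reply.readableBytes()];
        outbound.readBytes(data);

        // Prints "1" instead of "A": the echo, truncated to the reply's length.
        System.out.println(new String(data, CharsetUtil.UTF_8));

        outbound.release();
        reply.release();
    }
}

That is just the shape of mistake I am trying to rule out, not something I have actually found in my code.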