I have a little application that sends data to and receives data from another
application, using a non-blocking socket. I'm currently testing it, and it
usually works fine, but when it has been running for several hours, sometimes
something strange happens with the OnRead event. It's as if TClientSocket
stops firing the event, even though the server application shows that the
data has been sent. When this happens, no exception is raised and
I can't detect it... I've tried closing and reopening the socket
periodically, but it doesn't help; the OnRead events keep failing...
However, after leaving the application running for several more hours, the
OnRead event starts working again...
Does anyone know why the OnRead events stop working, or how I can detect
this state?
Thanks in advance
PS: Sorry for my English :)
Hi,
Yes, since reading from a socket is stream-based, not message-based, the
sender and receiver can get out of sync, and things like what you
describe can happen. But the event failing to fire at all is strange...
In my socket apps I use a fixed 6-byte message-length indicator at the
front of every message: the client/server always reads those 6 bytes first
to get the total length of the message, then reads the rest.
Also wrap the reads in a try..except block.
Cheers
My app already works like that. The first 4 bytes are the size; with them I
allocate a buffer. That works without problems, as long as the OnRead
events keep firing.
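For reference, here is a minimal sketch of that length-prefix pattern in an
OnRead handler, using the 4-byte variant. It is only an illustration under
assumptions: FRecvBuf is a form-level TMemoryStream created in FormCreate,
and HandleMessage stands in for the application's own dispatch routine;
neither name comes from the actual apps discussed here.

procedure TForm1.ClientSocket1Read(Sender: TObject; Socket: TCustomWinSocket);
var
  Chunk: array[0..4095] of Byte;
  Got, MsgLen: Integer;
  Payload: AnsiString;
  Rest: TMemoryStream;
begin
  { Drain everything currently available, so a frame that is already
    buffered is not left waiting for the next network packet. }
  while Socket.ReceiveLength > 0 do
  begin
    Got := Socket.ReceiveBuf(Chunk, SizeOf(Chunk));
    if Got <= 0 then
      Break;
    FRecvBuf.Seek(0, soFromEnd);
    FRecvBuf.Write(Chunk, Got);
  end;

  { Peel off every complete frame: 4-byte length prefix, then payload. }
  while FRecvBuf.Size >= SizeOf(Integer) do
  begin
    FRecvBuf.Position := 0;
    FRecvBuf.Read(MsgLen, SizeOf(Integer));
    if FRecvBuf.Size < SizeOf(Integer) + MsgLen then
      Break; { incomplete frame: keep the bytes for the next OnRead }
    SetLength(Payload, MsgLen);
    if MsgLen > 0 then
      FRecvBuf.Read(Payload[1], MsgLen);
    HandleMessage(Payload); { placeholder for the real dispatch }
    { Compact the buffer: keep only the bytes after the consumed frame. }
    Rest := TMemoryStream.Create;
    try
      if FRecvBuf.Position < FRecvBuf.Size then
        Rest.CopyFrom(FRecvBuf, FRecvBuf.Size - FRecvBuf.Position);
      FRecvBuf.Clear;
      if Rest.Size > 0 then
      begin
        Rest.Position := 0;
        FRecvBuf.CopyFrom(Rest, Rest.Size);
      end;
    finally
      Rest.Free;
    end;
  end;
end;

The compaction step is deliberately simple; a production version would avoid
the extra copy, but for a sketch it keeps the buffer logic easy to follow.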
I have tried capturing the OnError event too, but when the OnRead events
stop, no error is raised... EurekaLog doesn't report any errors either...
However, I'm beginning to suspect that the problem may be in the server
application (even though, when that app is sending, its SendBuf call
always returns the same number of bytes sent), because when the app is
running on several computers at once, all connected to the same server,
they usually start failing at more or less the same time (though not
always; sometimes one or two of them keep working a little longer). I'll
try putting a sniffer on the computers running the client and the server
to see if it gives me some clue...
Regards
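One possible way to detect the stalled state the first post asks about,
sketched under assumptions rather than as a known fix: record the time of
the last OnRead, and have a TTimer periodically check whether data is
sitting unread in the socket buffer while the event has gone quiet.
FLastRead, WatchdogTimer, and LogStall are hypothetical names, and the
one-minute threshold is arbitrary.

{ Assumes: uses SysUtils, ExtCtrls, ScktComp;
  FLastRead: TDateTime is a form field, set to Now in FormCreate. }

procedure TForm1.ClientSocket1Read(Sender: TObject; Socket: TCustomWinSocket);
begin
  FLastRead := Now; { note each time the event actually fires }
  { ... normal drain/framing logic goes here ... }
end;

procedure TForm1.WatchdogTimerTimer(Sender: TObject);
var
  Socket: TCustomWinSocket;
begin
  if not ClientSocket1.Active then
    Exit;
  Socket := ClientSocket1.Socket;
  { Bytes are waiting in the receive buffer, yet OnRead has been silent
    for over a minute: the event stream has probably stalled. }
  if (Socket.ReceiveLength > 0) and
     (Now - FLastRead > EncodeTime(0, 1, 0, 0)) then
  begin
    LogStall; { hypothetical: record the incident for diagnosis }
    ClientSocket1Read(Self, Socket); { drain the buffer manually }
  end;
end;

Even if the manual drain turns out to be unnecessary, the log entries would
at least timestamp the stalls, which could be correlated with the sniffer
captures mentioned above.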