Hi Mr. Tobi,
thanks a lot for your answer, and sorry for the delay; I took some time to run the experiments.
Actually I was making a mistake in my previous experiment: I was just opening a UDP socket through Python on machine A to stream the database file to machine B, which was running jAER. The file is sent in UDP datagrams, and there are a lot of packet losses in this case because the streaming on the server side is not paced according to the events' generation times. So I was not emulating the real-time generation of events at all; I was effectively just sending a large file over UDP.
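For what it's worth, pacing the sends by the recorded timestamps would have been closer to a real-time emulation. Below is a minimal sketch of what I mean; the 8-byte big-endian event layout (4-byte address + 4-byte timestamp in microseconds), the header skipping, and the host/port/file names are my assumptions, not necessarily jAER's actual file or network format:

import socket
import struct
import time

HOST, PORT = "machine-B.local", 8991    # hypothetical receiver address/port
EVENT_SIZE = 8                          # assumed: 4-byte address + 4-byte timestamp (us)
EVENTS_PER_PACKET = 128                 # 1024-byte datagrams, under a typical 1500-byte MTU

def stream_paced(path):
    """Send the recorded events over UDP, paced by their timestamps."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(path, "rb") as f:
        # Skip the ASCII header lines (each starting with '#') of an AEDAT-style file.
        pos = f.tell()
        line = f.readline()
        while line.startswith(b"#"):
            pos = f.tell()
            line = f.readline()
        f.seek(pos)

        ts_start = wall_start = None
        buf = b""
        while True:
            raw = f.read(EVENT_SIZE)
            if len(raw) < EVENT_SIZE:
                break
            _, ts = struct.unpack(">II", raw)   # (address, timestamp in microseconds)
            if ts_start is None:
                ts_start, wall_start = ts, time.monotonic()
            buf += raw
            if len(buf) >= EVENTS_PER_PACKET * EVENT_SIZE:
                # Sleep until the last event in this packet is "due" in wall-clock
                # time, so the stream reproduces the original event rate.
                delay = wall_start + (ts - ts_start) * 1e-6 - time.monotonic()
                if delay > 0:
                    time.sleep(delay)
                sock.sendto(buf, (HOST, PORT))
                buf = b""
        if buf:
            sock.sendto(buf, (HOST, PORT))

stream_paced("recording.aedat")         # hypothetical file name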
As suggested, I have also installed jAER on the server machine and performed the streaming jAER to jAER.
Yes, if I run the jAER server/client locally I do not get packet losses, but the scenario I am considering involves Mobile Edge Computing. The idea is to have the database on machine A and to stream the events in real time to machine B, which performs some task (e.g. image recognition, tracking, augmented reality) that triggers a feedback. This seems possible with the jAER-to-jAER streaming approach, so I will run some tests to evaluate packet losses and latency and understand the network requirements.
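Concretely, I was thinking of measuring the loss on the receiving machine along these lines. This is only a sketch: it assumes each datagram carries a leading 32-bit big-endian sequence counter (I believe jAER's unicast output can be configured to include one, but I still have to verify that on my version), and it measures loss only, since one-way latency would additionally require synchronized clocks between the two machines:

import socket
import struct
import time

LISTEN_PORT = 8991      # must match the jAER output port (assumption)

def measure_loss(duration_s=10.0):
    """Count datagrams and loss over a fixed window using sequence-number gaps."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LISTEN_PORT))
    sock.settimeout(1.0)
    received = lost = 0
    expected = None
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        try:
            data, _ = sock.recvfrom(65535)
        except socket.timeout:
            continue
        received += 1
        seq = struct.unpack(">I", data[:4])[0]   # assumed leading sequence counter
        if expected is not None and seq > expected:
            lost += seq - expected               # forward gap = missing datagrams
        expected = seq + 1                       # (reordered packets are ignored here)
    total = received + lost
    if total:
        print(f"received={received}  lost={lost}  loss rate={lost / total:.2%}")
    else:
        print("no packets received")

measure_loss(30.0)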
I would also like to ask whether there is some minimum amount of events, as an order of magnitude, required to perform a simple object recognition/detection. As I said, I am completely new to this field; knowing this would help me understand, for example, the minimum amount of events that I have to deliver correctly, or how much redundancy I should add to my network to guarantee proper functioning.
Thanks a lot