I have a Django REST API server that stores a data stream coming from my client apps. The clients use a Python application to send the data to the REST API through this architecture:
- Client apps
- [1.1] MQTT client (publish mode)
- REST API server
- [2.1] Django REST Framework
- [2.2] Celery
- [2.3] MQTT client (subscribe mode)
- MQTT server
Each client produces a data stream from an IoT device and sends it to the REST API server [2] using the MQTT client's publish method [1.1]. On the REST API server I have a STANDALONE SCRIPT [2.3] that subscribes to all messages from all users on the MQTT server and saves the data to the REST API server's database [2.1].
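To make the setup concrete, here is a minimal sketch of what I mean by the standalone subscriber [2.3]. It assumes the paho-mqtt package and a hypothetical topic layout of `users/<user_id>/data`; both are placeholders, not my actual configuration:

```python
# Sketch of the standalone subscriber script [2.3].
# Assumptions: paho-mqtt is installed, the broker runs on localhost:1883,
# and topics look like "users/<user_id>/data" -- all placeholders.


def parse_user_id(topic: str) -> str:
    """Extract the user id from a topic like 'users/42/data'."""
    parts = topic.split("/")
    if len(parts) < 3 or parts[0] != "users":
        raise ValueError(f"unexpected topic: {topic}")
    return parts[1]


def handle_message(topic: str, payload: bytes) -> dict:
    """Turn one MQTT message into a record to hand off for saving."""
    return {"user_id": parse_user_id(topic), "payload": payload.decode()}


def main() -> None:
    # Imported here so the helpers above work even without paho-mqtt.
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        record = handle_message(msg.topic, msg.payload)
        # This is the spot where the record should be handed to a
        # background worker instead of being saved inline.
        print(record)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)  # assumed broker address
    client.subscribe("users/+/data")   # '+' wildcard matches every user id
    client.loop_forever()


if __name__ == "__main__":
    main()
```

Right now everything happens inside `on_message`, which is single-threaded, and that is exactly the bottleneck I want to remove.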
The question: how do I run this standalone script alongside Django and process the data stream in a separate thread or process per user (some kind of background process that executes in parallel)? So if 5 users are online sending data, all 5 can send to the REST API without waiting for another user to complete their processing.
How do I do that with Celery? Is it only done via the delay method, or is there a best practice I'm missing?
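To show the pattern I'm imagining: the subscriber would enqueue one Celery task per message, and the workers would process them concurrently. The task name, broker URL, and model are placeholders, and the try/except fallback is only there so the sketch imports even where Celery isn't installed:

```python
# Sketch of the dispatch pattern I have in mind [2.2].
# Assumptions: Celery with a Redis broker at a placeholder URL; the
# task/model names are hypothetical.
try:
    from celery import Celery

    app = Celery("iot", broker="redis://localhost:6379/0")  # assumed broker
    task = app.task
except ImportError:
    def task(fn):  # no-op stand-in so the sketch runs without Celery
        return fn


@task
def save_reading(user_id: str, payload: str) -> dict:
    # In the real task this would write through the Django ORM, e.g.
    # Reading.objects.create(user_id=user_id, payload=payload).
    return {"user_id": user_id, "payload": payload}


# In the MQTT on_message callback, enqueue instead of saving inline:
#   save_reading.delay(user_id, payload)               # shorthand
#   save_reading.apply_async(args=[user_id, payload])  # same, more options
#
# Each message then becomes an independent task, so messages from 5
# online users are processed in parallel by however many worker
# processes are running (e.g. `celery -A iot worker --concurrency=5`).
```

Is `.delay()`/`.apply_async()` per message the right approach here, or is there a better pattern for per-user parallelism?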