Hello,
I have an optimization question about a system that loads 100+ XML files, chunks each XML, and sends the partial XMLs as messages to a RabbitMQ queue.
Right now I'm dealing with 1,000+ messages (100,000+ in the future) and I'm running into capacity issues, so I'm wondering what the best optimization would be.
Each message contains XML with some portion of the products. The message is then consumed and a PHP script is executed to load those products.
But because this XML can be large, I'm thinking of storing the XMLs in files and passing only a hash of the file as the message. The consumer would then load the file by its hash and pass the XML from that file to my PHP script.
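What you're describing is usually called the claim-check pattern: the queue carries only a small reference, and the payload lives in shared storage both sides can reach. A minimal sketch of the idea, assuming a shared filesystem visible to producer and consumer (the storage path and function names here are hypothetical, and the broker itself is left out):

```python
import hashlib
import os
import tempfile

# Hypothetical shared storage; in production this would be a path (or
# object store) reachable by both the producer and the consumer.
STORAGE_DIR = tempfile.mkdtemp()

def store_payload(xml_bytes: bytes) -> str:
    """Producer side: write the XML chunk to content-addressed storage
    and return its SHA-256 hex digest. Only this ~64-byte string is
    published to RabbitMQ, not the XML itself."""
    digest = hashlib.sha256(xml_bytes).hexdigest()
    with open(os.path.join(STORAGE_DIR, digest), "wb") as f:
        f.write(xml_bytes)
    return digest

def load_payload(digest: str) -> bytes:
    """Consumer side: resolve the hash from the message back to the
    full XML chunk before handing it to the processing script."""
    with open(os.path.join(STORAGE_DIR, digest), "rb") as f:
        return f.read()

# Simulated round trip (no broker needed for the sketch):
chunk = b"<products><product id='1'/></products>"
message = store_payload(chunk)   # this is the entire message body
assert load_payload(message) == chunk
```

One thing to plan for with this pattern: the files need a cleanup step (delete after successful consumption, or a TTL), otherwise the storage grows without bound.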
Does this help with the memory limit? Or is the limit about the number of messages rather than their size?
Or should I just drop that idea and use quorum queues, which automatically store messages on disk to save memory?
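For reference, switching a queue to a quorum queue is done at declaration time via the `x-queue-type` argument; quorum queues must also be durable. A small sketch of the declaration parameters (shown with Python's pika client for illustration; the queue name is hypothetical, and the same `x-queue-type` argument applies from php-amqplib):

```python
def quorum_queue_declare_args(queue_name: str) -> dict:
    """Keyword arguments for a queue_declare call that create a
    quorum queue (RabbitMQ 3.8+)."""
    return {
        "queue": queue_name,
        "durable": True,  # quorum queues cannot be transient
        "arguments": {"x-queue-type": "quorum"},
    }

# Against a live broker this would be used roughly as (not run here):
#   import pika
#   conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
#   channel = conn.channel()
#   channel.queue_declare(**quorum_queue_declare_args("products"))
```

Note that the queue type cannot be changed on an existing queue; you'd declare a new queue and migrate publishers/consumers to it.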
Or is there a better approach to this?