Hello everyone!
We're using pulsar in production for file serving. In some situations we need to upload an extremely large file to the server (e.g. 50 GB or so), and during the upload the data gets cached. So, I have two questions about this situation:
1. What happens if we exceed the memory limit? Do pulsar's caching-related classes handle a full cache gracefully, or will the application crash?
2. What is the simplest way to disable the cache and read the uploaded file in chunks (something like using the request's data_and_file method with the stream parameter, but with caching off)?
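To make question (2) concrete, here is a minimal sketch of the behaviour I'm after, written against a plain file-like body rather than pulsar's actual request API (the function name and chunk size here are just illustrative, not pulsar calls): consume the body in fixed-size chunks and write each chunk out immediately, so memory use stays bounded no matter how large the upload is.

```python
import io

CHUNK_SIZE = 64 * 1024  # process the body 64 KB at a time instead of caching it all


def save_upload(body, dest):
    """Copy a file-like request body to dest one chunk at a time.

    Returns the total number of bytes copied. Memory use is bounded
    by CHUNK_SIZE regardless of the size of the upload.
    """
    total = 0
    while True:
        chunk = body.read(CHUNK_SIZE)
        if not chunk:  # empty read signals end of stream
            break
        dest.write(chunk)
        total += len(chunk)
    return total


# usage with an in-memory stand-in for the request body
body = io.BytesIO(b"x" * 200_000)
out = io.BytesIO()
print(save_upload(body, out))  # 200000
```

The question, then, is where in pulsar's request handling I could plug in a loop like this so the framework never accumulates the whole body in its cache first.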
P.S. I have my own fork of pulsar, so I can modify the source in order to achieve the goals related to question (2).