Hi everyone,
I am trying to implement a custom reader that reads dynamically generated, non-geolocated raw image data from an S3 bucket via fsspec and assigns it a temporary projected area. Because the files are very large, I need to use dask.delayed for lazy loading.
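For context, the lazy-loading part currently looks roughly like this (the bucket path, shape, and dtype are placeholders; the real read would go through fsspec rather than the stub below):

```python
import dask
import dask.array as da
import numpy as np

SHAPE, DTYPE = (4, 6), np.float32  # illustrative; the real rasters are much larger
calls = {"n": 0}                   # counts how often the expensive read actually runs

def _read_raw(path, shape, dtype):
    # Stand-in for the expensive S3 read; real code would open the object
    # with fsspec.open(path) and reshape the raw bytes into an array.
    calls["n"] += 1
    return np.zeros(shape, dtype=dtype)

# Wrap the read so nothing touches S3 while the graph is being built.
delayed_read = dask.delayed(_read_raw)("s3://my-bucket/raw.bin", SHAPE, DTYPE)
lazy_arr = da.from_delayed(delayed_read, shape=SHAPE, dtype=DTYPE)

assert calls["n"] == 0         # building the dask graph reads nothing
result = lazy_arr.compute()    # the read happens only here
assert calls["n"] == 1 and result.shape == SHAPE
```

The point is that `da.from_delayed` only needs the shape and dtype up front, so the array can participate in further graph construction without any bytes being fetched.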
What is the best practice for keeping the area_definition and dataset metadata synchronized across multiple MultiScene calls in a distributed dask environment?
Specifically, how do I ensure satpy doesn't try to compute the whole image during Scene instantiation, while still letting pyresample know the coordinates?
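In case it clarifies the question, this is roughly how I attach the temporary area to the data right now. The plain dict below is only a dependency-free stand-in for a real `pyresample.geometry.AreaDefinition`; the `"area"` attrs key is the convention Satpy readers use, while the field names inside the dict are just illustrative:

```python
import numpy as np
import xarray as xr

# Lazily backed in real code (see the dask sketch above); eager here for brevity.
data = xr.DataArray(
    np.zeros((4, 6), dtype=np.float32),
    dims=("y", "x"),
)

# Placeholder for pyresample.geometry.AreaDefinition(...); a dict keeps
# this sketch runnable without pyresample installed.
temporary_area = {
    "area_id": "temp_merc",
    "projection": {"proj": "merc", "units": "m"},
    "width": 6,
    "height": 4,
    "area_extent": (-1000.0, -1000.0, 1000.0, 1000.0),
}

# Satpy convention: the area definition lives in the DataArray's attrs
# under the key "area", alongside the rest of the dataset metadata.
data.attrs["area"] = temporary_area
data.attrs["name"] = "raw_counts"  # illustrative dataset name

# Sanity check: area geometry agrees with the array dimensions.
assert data.attrs["area"]["width"] == data.sizes["x"]
assert data.attrs["area"]["height"] == data.sizes["y"]
```

My worry is whether attrs attached this way survive serialization intact when the MultiScene work is scattered across distributed workers.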