However, the array gets quite large (around 10 GB for 250 images and their corresponding dense labels). Does anyone have experience with saving image data at a reduced size?
a) It may sound silly, but image compression is a well-studied problem -- if you need to reduce storage space (or I/O bandwidth), use PNG, or JPEG/JPEG2000 if you can tolerate small differences. The PIL module allows you to compress/decompress in memory. If you prepare mini-batches in a separate thread, decompression should not slow you down by much.
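As a minimal sketch of the in-memory approach (the array shape here is just an example):

```python
import io
import numpy as np
from PIL import Image

# A stand-in for one image from the dataset (hypothetical size).
arr = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)

# Compress to PNG bytes entirely in memory -- no temp files needed.
buf = io.BytesIO()
Image.fromarray(arr).save(buf, format="PNG")
compressed = buf.getvalue()  # store these bytes instead of the raw array

# Decompress back to a numpy array, e.g. inside a mini-batch loader thread.
restored = np.array(Image.open(io.BytesIO(compressed)))
assert (restored == arr).all()  # PNG is lossless, so the round-trip is exact
```

For dense label maps, PNG is the safer choice, since JPEG artifacts would corrupt the label values.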
b) If you're worried about keeping everything in memory, note that you can load numpy arrays (in .npy format) as memory-mapped files: np.load(fn, mmap_mode='r'). This offloads the burden of loading from disk and/or caching to the operating system. To write a .npy file that is too large to fit into main memory, open it with np.lib.format.open_memmap(fn, mode='w+', dtype=..., shape=...) (the mode must be 'w+', since np.memmap does not accept plain 'w').
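A short sketch of the write-then-load round trip (filename and dimensions are made up for illustration):

```python
import os
import tempfile
import numpy as np

# Hypothetical path and dataset dimensions, e.g. 250 RGB images.
fn = os.path.join(tempfile.mkdtemp(), "images.npy")
shape = (250, 512, 512, 3)

# Create the .npy file on disk without allocating the whole array in RAM.
out = np.lib.format.open_memmap(fn, mode="w+", dtype=np.uint8, shape=shape)
out[0] = 255   # fill one slice at a time; the OS flushes pages to disk
out.flush()
del out        # close the writer

# Later: load memory-mapped; slices are read from disk only when accessed.
data = np.load(fn, mmap_mode="r")
first = data[0]  # only this slice is actually pulled into memory
```

Unwritten regions are zero-filled, so you can populate the file incrementally in any order.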
Hope this helps!
Best, Jan