Sorry about the slow reply to your other thread; I can try to answer here.
If I understand the use case correctly: you have a 2000x6000 image being generated within Python code, you want to pass it off to Swift for use in an NSImage, and you're currently running into performance and memory issues?
That's a fairly large image to pass around (2000 × 6000 pixels × 4 bytes per pixel = 48 MB), and anything that encodes or decodes it is likely to be slow. What is the original source of this image in your Python code? Going to and from a PNG, as the above code does, is probably going to take a while, and further encoding that as a base64 string will make it even slower. You're going to want to find a simpler way of bridging this across.
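For example, if the image is backed by a NumPy array (a guess on my part; the array below is just a hypothetical stand-in for whatever your code actually produces), you could skip the PNG and base64 steps entirely and copy the raw pixel bytes across in one shot. A rough sketch using PythonKit:

```swift
import Foundation
import PythonKit

let np = Python.import("numpy")

// Hypothetical stand-in for your generated image: a contiguous
// height x width x RGBA byte array (assuming 2000 wide, 6000 tall).
let array = np.zeros([6000, 2000, 4], dtype: np.uint8)

// NumPy publishes the address of its backing buffer through
// __array_interface__; "data" is an (address, read-only) pair.
guard let address = UInt(array.__array_interface__["data"][0]),
      let pointer = UnsafeMutableRawPointer(bitPattern: address) else {
    fatalError("couldn't read the NumPy buffer address")
}

// A single 48 MB copy into Swift-owned memory; no PNG, no base64.
let data = Data(bytes: pointer, count: 2000 * 6000 * 4)
```

That still copies the 48 MB once, but a straight memory copy is a very different cost from compressing, base64-encoding, decoding, and decompressing it.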
The ideal solution would be to find a way to memory-map the backing bytes of your image into a raw byte buffer that is also accessible to Swift, so that you avoid any expensive translation between the two. Unfortunately, I'm not as familiar with how you'd do this on the Python side (I could suggest ways to do this with memory-mapped OpenGL / Metal textures if you were working with those, for example, but that's not the case here). Others might have suggestions for how to approach this, but it would be helpful to know more about the type of image we're working with here and its source within your Python code.
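To make that concrete, under the same NumPy assumption as above: you could hand the array's backing buffer directly to CoreGraphics through a CGDataProvider, so the NSImage reads Python's memory in place with no copy at all. This is only a sketch, and the big caveat is lifetime: nothing is copied, so the array must stay alive (and its storage unchanged) for as long as the image is in use.

```swift
import AppKit
import PythonKit

let np = Python.import("numpy")

// Hypothetical stand-in for your image; assumes 2000 wide by 6000 tall RGBA.
let array = np.zeros([6000, 2000, 4], dtype: np.uint8)

let width = 2000
let height = 6000
let bytesPerRow = width * 4

guard let address = UInt(array.__array_interface__["data"][0]),
      let pointer = UnsafeMutableRawPointer(bitPattern: address),
      // Hand CoreGraphics the NumPy buffer in place; the empty release
      // callback means Swift never tries to free Python's memory.
      let provider = CGDataProvider(dataInfo: nil, data: pointer,
                                    size: height * bytesPerRow,
                                    releaseData: { _, _, _ in }),
      let cgImage = CGImage(width: width, height: height,
                            bitsPerComponent: 8, bitsPerPixel: 32,
                            bytesPerRow: bytesPerRow,
                            space: CGColorSpaceCreateDeviceRGB(),
                            // Assumes non-premultiplied RGBA byte order.
                            bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue),
                            provider: provider, decode: nil,
                            shouldInterpolate: false, intent: .defaultIntent)
else { fatalError("couldn't wrap the NumPy buffer") }

let image = NSImage(cgImage: cgImage, size: NSSize(width: width, height: height))
// Keep `array` alive while `image` is in use: the pixels were never copied.
```

If tying the two lifetimes together feels too fragile for your use case, the single-copy Data approach above is a reasonable middle ground.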