TL;DR: How long does it take for a retina image to be displayed on the screen using OpenFL? Is it possible to use multithreading to load images on iOS?
Long version:
I'm developing, in Adobe AIR, a sort of photo album for a client that involves viewing retina images on the iPad (2048×1536). At first I loaded each image individually, and it worked fine: I loaded one photo, and when the user swiped left or right I removed the current photo and loaded the next one.
Now my client insists he wants to see the photos next to each other, "like in the Apple Photos app", which forces me to preload all the photos in the album: if I load them on the fly, it takes too long and the UI animations aren't fluid. Another problem I haven't been able to solve is that Adobe Scout reports "rendering dirty regions", which causes the app to stutter for a few frames whenever I display a new image (even if that image is already loaded and decoded in memory).
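To be concrete about the preloading, this is roughly the structure I have in mind, sketched in Haxe/OpenFL (the framework I ask about below). PhotoCache, the one-neighbour window, and using BitmapData.loadFromFile as the async entry point are all my own assumptions, not anything I've benchmarked:

import openfl.display.BitmapData;

class PhotoCache {
    var paths:Array<String>;
    var cache:Map<Int, BitmapData> = new Map();

    public function new(paths:Array<String>) {
        this.paths = paths;
    }

    // Keep the current photo plus one neighbour on each side
    // decoded in memory; dispose of everything outside that window.
    public function preloadAround(index:Int):Void {
        var stale = [for (k in cache.keys()) if (k < index - 1 || k > index + 1) k];
        for (k in stale) {
            cache.get(k).dispose();
            cache.remove(k);
        }
        for (i in (index - 1)...(index + 2)) {
            if (i < 0 || i >= paths.length || cache.exists(i)) continue;
            var idx = i; // per-request copy for the callback
            BitmapData.loadFromFile(paths[idx])
                .onComplete(bmd -> cache.set(idx, bmd));
        }
    }

    public function get(index:Int):Null<BitmapData> {
        return cache.get(index);
    }
}

The idea would be to call preloadAround(newIndex) on every swipe, so the neighbouring photos are already decoded by the time they scroll onto the screen.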
I've made a native extension in Objective-C to display large photos, and it loads and displays them much better than AIR does. The problem is that, since photo viewing is central to the app, this route would force me to develop a large portion of my code in Objective-C, which would take me too long, as I don't know much Objective-C or Apple's SDK.
Another solution would be to use Starling, but on the forums people say that uploading images to GPU memory takes too long.
So I'd like to turn to Haxe/OpenFL and see if I can solve this there. Before jumping in, could someone tell me whether you think performance would improve?
How long does it take to load a retina image and display it on the screen?
Can this be done in a separate thread?
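On the threading question, my rough understanding is that something like the following should be possible on the C++ targets. This is a minimal sketch assuming Haxe 4's sys.thread; I don't know whether haxe.MainLoop integrates cleanly with Lime's event loop, or whether lime.graphics.Image.fromFile is safe to call off the main thread (that is exactly the kind of thing I'm asking about):

import haxe.MainLoop;
import sys.thread.Thread;
import lime.graphics.Image;
import openfl.display.Bitmap;
import openfl.display.BitmapData;

class WorkerLoader {
    // Decode the file on a worker thread, then hop back to the
    // main thread before touching any display objects.
    public static function load(path:String, onReady:Bitmap->Void):Void {
        Thread.create(() -> {
            var image = Image.fromFile(path); // blocking read + decode
            MainLoop.runInMainThread(() -> {
                // Display-list work stays on the main thread.
                onReady(new Bitmap(BitmapData.fromImage(image)));
            });
        });
    }
}

If that pattern (or something equivalent) works on iOS, it would solve the stutter I'm seeing when a new image is displayed.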
Any help is appreciated!
TIA