Hi.
I'll start off by saying I'm relatively new to Haxe, HaxeFlixel, iOS, and Mac... but I've been game programming on Windows for about 10 years now... so, there's that to set the stage. :)
I made a game for Ludum Dare 31 in HaxeFlixel, and it was generally an awesome experience. I just got a new MacBook Pro the other day, so I started building for different targets.
Flash, Windows, and Mac targets all went off without a hitch. However, I ran into some issues with iOS. After googling the error messages, I got most, if not all, of my libraries updated to the latest dev versions on Git, and now I'm compiling to the iOS simulator just fine.
My confusion and questions come in with the iOS simulator and touch input. With no touch controls in the code (meaning, not using FlxTouch or FlxG.touches anywhere), and using only the mouse (FlxG.mouse), all input works exactly as expected in the simulator. With touch code, it does not work. Well, it acknowledges some touches, but the coordinates are way off.
The behavior can be observed by running the Multitouch demo in flixel-demos. The touch mapping is way off (though it seems to be more accurate in the upper-left corner). I also noticed that FlxButton has touch support, and when I built with FLX_NO_MOUSE, the buttons stopped working (which led me to believe they were picking up mouse input, not touch).
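For reference, the touch polling I'm doing is basically this kind of pattern (a simplified sketch, not my exact code; the state name is just a placeholder):

```haxe
import flixel.FlxG;
import flixel.FlxState;

class PlayState extends FlxState
{
    // Poll the active touches each frame and log where flixel thinks they landed.
    // (flixel 3.x-style update(); newer versions take an elapsed:Float argument.)
    override public function update():Void
    {
        super.update();

        for (touch in FlxG.touches.list)
        {
            if (touch.justPressed)
            {
                // On the iOS simulator these coordinates come back far from where I
                // actually click, except near the upper-left corner.
                trace("touch at " + touch.x + ", " + touch.y);
            }
        }
    }
}
```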
So my questions are:
1) Is this a known issue, and I just haven't googled enough?
2) Is there something going on in the background that translates mouse input to touch input in the iOS builds? Or is it something in the simulator itself, which is picking up the mouse controls since, technically, a click is a mouse event?
My input controls are super simple, so if the iOS builds pick up FlxG.mouse input, everything would work fine... it just doesn't seem right to me, if you know what I mean.
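In other words, the workaround I'm tempted to write is something like this (just a sketch of the idea, assuming FLX_NO_MOUSE is only defined for mobile targets in Project.xml):

```haxe
import flixel.FlxG;

class Input
{
    // Treat either the mouse or any active touch as "the pointer just went down".
    // Falling back to FlxG.mouse on iOS feels wrong, though, if the simulator is
    // really just feeding clicks through as mouse events.
    public static function pointerJustPressed():Bool
    {
        #if FLX_NO_MOUSE
        for (touch in FlxG.touches.list)
        {
            if (touch.justPressed)
                return true;
        }
        return false;
        #else
        return FlxG.mouse.justPressed;
        #end
    }
}
```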
Thanks