Hi all,
I am currently trying to deal with screen orientation and sensors in a
NativeActivity. Some of these issues are not specifically NDK related
but I am interested in solving them from native code (even if I have
to use JNI callbacks).
What I know:
- One needs to detect screen rotation relative to the device's natural
orientation, in order to properly handle devices whose default
orientation can be either landscape or portrait (see
http://android-developers.blogspot.com/2010/09/one-screen-turn-deserves-another.html).
- A good way to convert from natural orientation to screen orientation
is to swap or invert axes, as explained in the NVidia doc:
http://developer.download.nvidia.com/tegra/docs/tegra_android_accelerometer_v5f.pdf.
- When the orientation changes by 90°, the application is recreated, so
the new orientation can be detected when the application restarts.
- But when the orientation changes by 180°, the configuration does not
change and no event is fired.
What I would like to know:
- Is there a proper API to perform the equivalent of
Display.getOrientation() natively? Currently, the only solution I see
relies on JNI. Through the AConfiguration API (e.g.
AConfiguration_getOrientation), we can only detect whether the
application is in portrait or landscape, not in reverse portrait or
reverse landscape.
- How can a 180° orientation change be detected, apart from regularly
polling the orientation (through JNI :))?
- Is it possible to restrict orientation to landscape and portrait
(allowing both) while excluding reverse landscape and reverse portrait?
In the Android manifest, we can use portrait or landscape, but not both
at the same time. My guess is that we have to handle it ourselves with
android:configChanges, but the configuration does not include "reverse"
information...
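For reference, the JNI route for the first question could look roughly like the sketch below. It is not standalone-compilable outside an Android project; getDisplayRotation is a made-up helper name, and I am assuming Display.getRotation() (the non-deprecated replacement for getOrientation(), API 8+) is available:

```cpp
#include <jni.h>
#include <android/native_activity.h>

// Sketch: query Surface.ROTATION_* (0..3) for the default display via JNI,
// since no NDK API exposes it. The calling thread must be attachable to
// the JavaVM (e.g. the native_app_glue thread).
static int getDisplayRotation(ANativeActivity* activity) {
    JNIEnv* env = nullptr;
    activity->vm->AttachCurrentThread(&env, nullptr);

    // activity.getWindowManager()
    jclass activityClass = env->GetObjectClass(activity->clazz);
    jmethodID getWindowManager = env->GetMethodID(
        activityClass, "getWindowManager", "()Landroid/view/WindowManager;");
    jobject windowManager = env->CallObjectMethod(activity->clazz, getWindowManager);

    // windowManager.getDefaultDisplay()
    jclass wmClass = env->GetObjectClass(windowManager);
    jmethodID getDefaultDisplay = env->GetMethodID(
        wmClass, "getDefaultDisplay", "()Landroid/view/Display;");
    jobject display = env->CallObjectMethod(windowManager, getDefaultDisplay);

    // display.getRotation()
    jclass displayClass = env->GetObjectClass(display);
    jmethodID getRotation = env->GetMethodID(displayClass, "getRotation", "()I");
    jint rotation = env->CallIntMethod(display, getRotation);

    activity->vm->DetachCurrentThread();
    return rotation;  // 0=ROTATION_0, 1=ROTATION_90, 2=ROTATION_180, 3=ROTATION_270
}
```

Unlike AConfiguration_getOrientation, this distinguishes all four rotations, including the reversed ones.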
My conclusion from all of this is that, when dealing with sensors,
orientation should be fixed, and handled manually if absolutely
required (although detecting rotation relative to the natural
orientation is still necessary)... Am I wrong?
Thanks in advance