Open Both Front And Back Cameras Simultaneously


Andrew Villagran

Apr 14, 2020, 11:22:52 AM
to Android CameraX Discussion Group

Hi, I'm currently trying to replace my company's Camera2 implementation with CameraX (camera-core:1.0.0-beta02, camera-lifecycle:1.0.0-beta02, camera-camera2:1.0.0-beta02, camera-view:1.0.0-alpha09).  We currently use Camera2 for access control using face detection.

(1) One of our requirements is that we be able to open both the front and back cameras simultaneously.  On the devices we load Android onto, both cameras physically face the front (even though in software we access the first as LENS_FACING_FRONT and the second as LENS_FACING_BACK).  The first is a normal color camera; the second is an infrared camera.  We use the color camera to display the user's image on screen, while we silently pass the infrared camera's preview frames to the face detector.  We feel an infrared camera is a good choice for face detection because it's impervious to lighting conditions, whether it's sunny or completely dark.  We can currently achieve this with Camera2, but not with CameraX.  Here's what I tried:


// Camera selectors.
final CameraSelector colorCameraSelector = new CameraSelector.Builder()
        .requireLensFacing(CameraSelector.LENS_FACING_FRONT)
        .build();
final CameraSelector infraredCameraSelector = new CameraSelector.Builder()
        .requireLensFacing(CameraSelector.LENS_FACING_BACK)
        .build();

// Preview.
final Preview colorPreview = new Preview.Builder()
        .setTargetResolution(targetResolution)
        .setTargetRotation(targetRotation)
        .build();

// Image analysis.
final ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
        .setTargetResolution(targetResolution)
        .setTargetRotation(targetRotation)
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build();
imageAnalysis.setAnalyzer(executorService, new ImageAnalyzer());

// Must unbind the use cases before rebinding them.
processCameraProvider.unbindAll();

final Camera colorCamera = processCameraProvider.bindToLifecycle(
        lifeCycleOwner1, colorCameraSelector, colorPreview);
final Camera infraredCamera = processCameraProvider.bindToLifecycle(
        lifeCycleOwner2, infraredCameraSelector, imageAnalysis);

// Attach the preview view's surface provider to the preview use case.
colorPreview.setSurfaceProvider(
        colorPreviewView.createSurfaceProvider(colorCamera.getCameraInfo()));


From my tests, only the second call to bindToLifecycle() actually sticks. Are you able to add support for this feature?

(2) Is it possible to start and stop a Preview or an ImageAnalysis without binding it to a LifecycleOwner?  You see, our devices have proximity sensors: when we detect a user's presence we turn on the cameras and show the preview images, and when the user walks away we turn the cameras off.  So we need to start and stop the cameras independently of the normal lifecycle callbacks.  Currently, the hack I use to make this work is a fake class that implements LifecycleOwner, on which I manually call lifecycleRegistry.setCurrentState() with whatever state I need to turn the cameras on and off.
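For concreteness, the "fake LifecycleOwner" hack I mean looks roughly like this (a sketch; the class and method names are my own, and this is Android framework code, so it only compiles against androidx.lifecycle):

```java
import androidx.lifecycle.Lifecycle;
import androidx.lifecycle.LifecycleOwner;
import androidx.lifecycle.LifecycleRegistry;

/** A LifecycleOwner whose state is driven manually, e.g. by a proximity sensor. */
public class ManualLifecycleOwner implements LifecycleOwner {
    private final LifecycleRegistry registry = new LifecycleRegistry(this);

    /** Move to RESUMED so the bound use cases start streaming. */
    public void start() {
        registry.setCurrentState(Lifecycle.State.RESUMED);
    }

    /** Move back to CREATED so the bound use cases stop. */
    public void stop() {
        registry.setCurrentState(Lifecycle.State.CREATED);
    }

    @Override
    public Lifecycle getLifecycle() {
        return registry;
    }
}
```

We then pass an instance of this class to bindToLifecycle() and call start()/stop() from the proximity-sensor callback.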

(3) When building Preview and ImageAnalysis objects, I've found it works better for me to use setTargetResolution() instead of setTargetAspectRatio().  The reason is that when we try to draw an oval around the user's face in a graphic overlay, it's difficult to figure out how to transform the coordinates so that the oval isn't stretched oddly.  I've been using this template (https://github.com/googlesamples/android-vision/blob/master/visionSamples/multi-tracker/app/src/main/java/com/google/android/gms/samples/vision/face/multitracker/ui/camera/GraphicOverlay.java) as a basis for our graphic overlay, but I'm not sure how to tweak it when using setTargetAspectRatio() instead of setTargetResolution().  Any help?

Finally, thanks so much for creating this library.  It's helping us reduce ~1000 lines of Camera2 code to ~30 lines with CameraX.

Vinit Modi

Apr 14, 2020, 12:49:24 PM
to Andrew Villagran, Android CameraX Discussion Group
Hi Andrew,

Thanks for trying CameraX and reaching out! :) I will comment on (1), simultaneous cameras, and defer to the team for (2) and (3):
  • The majority of Android phones do not have the underlying driver support in the hardware abstraction layer (HAL) to open two cameras simultaneously
    • We are working with manufacturers to add this support
    • There is now an official API in Android R for simultaneous front and back cameras which lets you check whether a device supports that functionality
    • Once this functionality is in camera2, we can look at adding it to CameraX
  • Regarding using a color and an infrared camera together, did I understand correctly that they face the same direction? One concern is that most Android phones do not have an IR and a color camera facing the same direction, so the functionality you build may only work on a very small set of devices
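To make the Android R check concrete: on API 30+, camera2's CameraManager.getConcurrentCameraIds() returns the sets of camera IDs the device can stream from at the same time. A sketch of a helper around that result (the class and method names here are illustrative, and the Android call itself is factored out so the logic is plain Java):

```java
import java.util.Set;

public final class ConcurrentCameraCheck {
    /**
     * On Android R (API 30+), pass in the result of
     * CameraManager.getConcurrentCameraIds(); each inner set is one
     * combination of camera IDs the device can open simultaneously.
     */
    public static boolean canOpenConcurrently(
            Set<Set<String>> concurrentIds, String idA, String idB) {
        for (Set<String> combination : concurrentIds) {
            if (combination.contains(idA) && combination.contains(idB)) {
                return true;
            }
        }
        return false;
    }
}
```

If the pair of IDs you need never appears together in any combination, the HAL does not guarantee they can be opened at once.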
Thanks!
Vinit


Andrew Villagran

Apr 14, 2020, 1:15:27 PM
to Android CameraX Discussion Group, andrewv...@gmail.com
Hi Vinit,

Thanks for the response!  The Camera2 library may not currently have a way to detect whether a device supports simultaneous cameras, but it definitely doesn't stop the developer from opening both cameras if they try.  Right now I have a working Camera2 application that opens both cameras simultaneously (we preload our application onto custom hardware that is guaranteed to support both cameras; we don't distribute our app via Google Play).

Using Camera2, I have one TextureView for the color camera and another for the infrared camera.  When onSurfaceTextureAvailable() fires on each one, I start the color camera and the infrared camera, respectively.  When each camera's onOpened() is called I call createCaptureSession() on both, but for the infrared camera I additionally call setOnImageAvailableListener() so that I can get preview frames to pass to our frame-processing Runnable.  In our XML layout, the color camera's TextureView fills the screen, but I make the infrared camera's TextureView 1px by 1px so that it's effectively invisible (I couldn't get the capture session to work unless the TextureView was visible).
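Stripped down to its essence, the Camera2 side is just two independent openCamera() calls; whether the second one succeeds depends entirely on the device's HAL. A heavily condensed sketch (error handling, the capture sessions, and the IR ImageReader are omitted, and the camera IDs "0"/"1" are illustrative):

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

class DualCameraOpener {
    /**
     * Opens the color and infrared cameras back to back. Nothing in camera2
     * itself prevents the second openCamera() call; on hardware without HAL
     * support, the callback's onError()/onDisconnected() fires instead.
     */
    void openBothCameras(Context context, Handler handler,
                         CameraDevice.StateCallback colorCallback,
                         CameraDevice.StateCallback irCallback)
            throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        manager.openCamera("0", colorCallback, handler);  // color camera
        manager.openCamera("1", irCallback, handler);     // infrared camera
    }
}
```

Each StateCallback's onOpened() then calls createCaptureSession() for its own device, exactly as described above.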

Andy

Franklin Wu

Apr 14, 2020, 6:25:47 PM
to Android CameraX Discussion Group, andrewv...@gmail.com
Hi Andrew,
Here are answers to the other questions:
From my tests, only the second call to bindToLifecycle() actually sticks. Are you able to add support for this feature?

It is explicitly part of the API that only the last LifecycleOwner bound using bindToLifecycle() will be active. Is there a reason you are binding to 2 different LifecycleOwners? If you can, I would suggest using the same LifecycleOwner. I don't know when CameraX will allow binding of 2 LifecycleOwners, but CameraX cannot change the definition of existing APIs unless it moves to a 2.0.0 release. It'll need to be a feature request that we figure out how to incorporate.


(2) Is it possible to start and stop a Preview or an ImageAnalysis without binding it to a LifecycleOwner? You see, our devices have proximity sensors and when we detect a user's presence, we turn on the cameras and show the preview images, and when the user walks away we turn off the cameras. So we need to be able to start and stop the cameras independent of the normal lifecycle callbacks. Currently, the hack I introduced to make this work is to create a fake class that implements LifecycleOwner and manually call lifecycleRegistry.setCurrentState() to the state I need to make the cameras turn on and off.

As of right now there is no way to start a Preview/ImageAnalysis without binding it to a LifecycleOwner. However, it is possible to stop streaming frames:
For ImageAnalysis, you can clear the analyzer with clearAnalyzer().
For Preview, if you are using a SurfaceTexture, you can call setSurfaceProvider() with null, which will remove the current SurfaceProvider.
This will stop frames from streaming (i.e. stop the repeating requests for both of those Surfaces).

Unfortunately, there is no way to close a CameraDevice completely without unbinding the use cases.
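Putting those two suggestions together, a proximity-driven pause/resume might look like this (a sketch only; the field names match the earlier snippet in this thread but are otherwise illustrative, and it targets the beta02 API as I understand it):

```java
import java.util.concurrent.ExecutorService;

import androidx.camera.core.Camera;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.Preview;
import androidx.camera.view.PreviewView;

/** Sketch: pause/resume streaming without unbinding the use cases. */
class ProximityController {
    ImageAnalysis imageAnalysis;
    Preview colorPreview;
    PreviewView colorPreviewView;
    Camera colorCamera;
    ExecutorService executorService;

    void onUserLeft() {
        imageAnalysis.clearAnalyzer();          // stop delivering frames to the analyzer
        colorPreview.setSurfaceProvider(null);  // detach the preview surface
    }

    void onUserApproached() {
        imageAnalysis.setAnalyzer(executorService, new ImageAnalyzer());
        colorPreview.setSurfaceProvider(
                colorPreviewView.createSurfaceProvider(colorCamera.getCameraInfo()));
    }
}
```

The camera device itself stays open in both states; only the frame delivery stops and resumes.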

(3) When building Preview and ImageAnalysis objects, I've found it works better for me to use setTargetResolution() instead of setTargetAspectRatio(). The reason is that when we try to draw an oval around the user's face in a graphic overlay, it's difficult to figure out how to transform the coordinates so that the oval isn't stretched oddly. I've been using this template (https://github.com/googlesamples/android-vision/blob/master/visionSamples/multi-tracker/app/src/main/java/com/google/android/gms/samples/vision/face/multitracker/ui/camera/GraphicOverlay.java) as a basis for our graphic overlay, but I'm not sure how to tweak it when using setTargetAspectRatio() instead of setTargetResolution(). Any help?
Can you clarify the issues you are running into when using setTargetAspectRatio()? The calculations should be the same whether you use setTargetResolution() or setTargetAspectRatio(). You can get the actual resolution of the ImageAnalysis by examining the ImageProxy in the Analyzer, and for Preview you can get it from the SurfaceRequest of the SurfaceProvider (if you are using setSurfaceProvider).
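In code, those two places to read the actual resolution look roughly like this (a sketch against the beta-era API; the class name is illustrative):

```java
import android.util.Size;

import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageProxy;
import androidx.camera.core.Preview;
import androidx.camera.core.SurfaceRequest;

class ResolutionReader {
    // For ImageAnalysis: each delivered ImageProxy carries the actual buffer size.
    ImageAnalysis.Analyzer analyzer = new ImageAnalysis.Analyzer() {
        @Override
        public void analyze(ImageProxy image) {
            Size actual = new Size(image.getWidth(), image.getHeight());
            // ... scale overlay coordinates using "actual" ...
            image.close();  // must close the proxy to receive more frames
        }
    };

    // For Preview: the SurfaceRequest handed to the SurfaceProvider carries it.
    Preview.SurfaceProvider surfaceProvider = new Preview.SurfaceProvider() {
        @Override
        public void onSurfaceRequested(SurfaceRequest request) {
            Size actual = request.getResolution();
            // ... size the overlay (and the provided surface) to "actual" ...
        }
    };
}
```

Either of those sizes can then drive the scale factors in the GraphicOverlay template linked above.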

Andrew Villagran

Apr 22, 2020, 4:47:21 PM
to Android CameraX Discussion Group, andrewv...@gmail.com
Hi Franklin,

Is there a reason that you are binding to 2 different LifecycleOwners?

I found this in ProcessCameraProvider.java:

* <p>If different use cases are bound to different camera selectors that resolve to distinct
* cameras, but the same lifecycle, only one of the cameras will operate at a time. The
* non-operating camera will not become active until it is the only camera with use cases bound.

I guess my rationale was that I was hoping that if different use cases were bound to different camera selectors that resolved to distinct cameras, but used different lifecycles, then both cameras could operate at the same time.

I understand that at the moment only a small subset of devices can support opening 2 cameras at once, but is it possible for CameraX not to actively stop developers who attempt to open both at once?  Is this a difficult feature to implement?  Can you show me where to make an official feature request?

So far I'm loving CameraX sooo much; I'm so motivated to replace our Camera2 implementation with CameraX!

Andy


Franklin Wu

Apr 23, 2020, 3:29:18 PM
to Android CameraX Discussion Group, andrewv...@gmail.com
Hi Andy,
   We are starting with a more conservative subset of features being made available, because one of the more difficult things about working with camera2 is knowing which combinations of streams can be configured successfully (which includes multiple cameras).

   It's definitely desirable to allow multiple cameras. The difficulty is not in opening up the behavior, but in doing so in a manner that doesn't easily lead to setting an invalid configuration.

   You can file a feature request here. If you have ideas on how you'd want to access it in a safe manner (e.g. how to determine that a device supports multiple cameras, being able to use multiple lifecycles, etc.), that would be great, since it will help us know how and what to build.

   Really awesome to hear that you are loving CameraX.

Дмитрий Никишов

Jul 9, 2020, 11:58:07 AM
to Android CameraX Discussion Group, nilkn...@google.com, andrewv...@gmail.com
Hi Andy,
  Did you find a workaround to use both cameras simultaneously? I ran into the same issue, and given the CameraX team's answer I see only one way to use both cameras:
- the Camera1 API for the IR camera, which needs less logic
- the CameraX API for the RGB camera.
I tested this solution on my device and it works. Do you see any disadvantages to using two camera APIs?
  
Dmitry

Tomáš Válek

Nov 8, 2020, 2:26:47 PM
to Android CameraX Discussion Group, lofe...@gmail.com, nilkn...@google.com, andrewv...@gmail.com
Hello all,

three months after the last answer, has anyone found a workaround to use both cameras simultaneously using only CameraX?

Tom.
