I am trying to use native WebRTC on a BeagleBone Black running Android 4.4. I downloaded the WebRTC code from
https://webrtc.org/native-code/android/ and compiled it for Android as per the instructions on that page. Then I compiled the AppRTC demo example. This demo works well on two Android mobile phones, since it detects the phone cameras, but when I install the app on the BeagleBone Black it does not discover any camera (not even a USB web camera). I want to change the video/audio source to an IP camera in the Android app. AppRTC uses the libjingle_peerconnection.jar file (generated from the WebRTC source by the Android build).
I also tested the AppRTC demo on a mobile phone with a USB webcam attached through an OTG cable. libjingle_peerconnection.jar has a class called CameraEnumerationAndroid which provides this method:
public static int getDeviceCount() {
    return Camera.getNumberOfCameras();
}
getNumberOfCameras() calls into native Android code, which only returns a count of 2 (one for the front camera and one for the back camera); it does not consider a USB webcam an input source.
Is there any way I can always use an IP camera as the video source inside a WebRTC app? Most IP cameras provide an RTSP stream as output. I want to use this RTSP stream as the video source and establish a call using WebRTC from my BeagleBone Android device, without running a separate WebRTC server on the BeagleBone, since we already have a cloud-based WebRTC server running. I have looked at the Kurento and Janus WebRTC gateway servers, which could be ported to the BeagleBone Black to read the IP camera source and start a WebRTC session/call with other clients, but I don't want an extra server running on the BeagleBone.
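To make the idea concrete, here is the rough shape of what I imagine: bypassing CameraEnumerationAndroid entirely and feeding decoded RTSP frames into the peer connection through a custom capturer. This is only a sketch with plain-Java stubs; `Frame`, `FrameSink`, and `RtspCapturer` are hypothetical placeholder names, not the real org.webrtc classes, and the actual decoding (RTP depacketizing plus H.264 decode, e.g. via MediaCodec) is omitted.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for WebRTC's video frame type.
class Frame {
    final int width, height;
    final byte[] yuvData;
    Frame(int width, int height, byte[] yuvData) {
        this.width = width; this.height = height; this.yuvData = yuvData;
    }
}

// Hypothetical stand-in for the sink a video track would read frames from.
interface FrameSink {
    void onFrame(Frame frame);
}

// Sketch of a capturer that would open the RTSP stream, decode it, and hand
// frames to WebRTC, so Camera.getNumberOfCameras() is never consulted.
class RtspCapturer {
    private final String rtspUrl;
    private final List<FrameSink> sinks = new ArrayList<>();

    RtspCapturer(String rtspUrl) { this.rtspUrl = rtspUrl; }

    void addSink(FrameSink sink) { sinks.add(sink); }

    // A real implementation would run the RTSP client and decoder here;
    // this stub just delivers one dummy 640x480 I420 frame to show the flow.
    void start() {
        byte[] dummyI420 = new byte[640 * 480 * 3 / 2];
        deliver(new Frame(640, 480, dummyI420));
    }

    private void deliver(Frame frame) {
        for (FrameSink s : sinks) s.onFrame(frame);
    }
}

public class Main {
    public static void main(String[] args) {
        // Placeholder URL; a real IP camera would expose something similar.
        RtspCapturer capturer = new RtspCapturer("rtsp://camera.local/stream1");
        capturer.addSink(f -> System.out.println("got frame " + f.width + "x" + f.height));
        capturer.start();
    }
}
```

Is this the right direction, i.e. implementing a custom video capturer against the libjingle API, or is there a supported way to do this?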
Please help.