My current setup:
- OpenCV2.3.1-android-arm
- ffmpeg-0.7.5-android-arm
- HTC Evo 3D running Android 2.3.4 (rooted, but just a basic ROM to
remove CarrierIQ)
- Eclipse and the Android SDK, NDK, etc. are all the latest versions.
My goal:
- Capture an image somehow from OpenCV that I can turn into an
AVFrame and send to a custom video server.
Methods I have tried to get an image:
- CvCapture cap*
- FFmpegFrameGrabber
- OpenCVFrameGrabber
*Used this to get a CvCapture =
opencv_highgui.cvCreateCameraCapture(opencv_highgui.CV_CAP_ANDROID)
Also tried a lot of other device numbers just to see, none worked.
Other notes:
- I am very flexible about which codecs I use, but would prefer H.264
over H.263.
- Preferred resolution is CIF (352x288), but I would settle for QCIF
if the processing overhead is too much at the higher resolution.
- To send my video, all I need is an encoded video frame as a byte[].
Thanks - Charles
Since Android has its own Java API for camera capture, why not simply
use that? I understand that the OpenCV guys added some hack to highgui
to capture from the native side, but if you're going with Java anyway,
I'd use the official API...
Samuel
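To make Samuel's suggestion concrete, a capture loop with the official
android.hardware.Camera API might look roughly like the sketch below. The
class and callback names are the real platform API on Android 2.3, but the
wiring here is only an illustration, not a tested implementation:

```java
import android.graphics.ImageFormat;
import android.hardware.Camera;

// Sketch only: error handling and the SurfaceHolder setup are omitted.
public class PreviewCapture {
    private Camera camera;

    public void start() {
        camera = Camera.open();                     // default (back) camera
        Camera.Parameters params = camera.getParameters();
        params.setPreviewSize(352, 288);            // CIF, if the device supports it
        params.setPreviewFormat(ImageFormat.NV21);  // the default preview format
        camera.setParameters(params);

        camera.setPreviewCallback(new Camera.PreviewCallback() {
            public void onPreviewFrame(byte[] data, Camera c) {
                // 'data' holds one NV21 preview frame: 352*288 Y bytes
                // followed by interleaved VU bytes. Hand it off to the
                // encoder from here.
            }
        });
        camera.startPreview();  // callbacks only fire while previewing
    }

    public void stop() {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
    }
}
```

Note that on most devices of that era, startPreview() only delivers frames
after a surface has been attached with setPreviewDisplay(), even if the
surface is never shown on screen.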
Samuel
> opencv_highgui.cvCreateCameraCapture(opencv_highgui.CV_CAP_ANDROID)
How did you accomplish this? It might be useful for others to know, thanks.
> able to have different codecs available from ffmpeg. Is there a way
> to use the ffmpeg-android*.jar to access custom compiled ffmpeg
> libraries (.so)?
The ffmpeg-*-android-arm.zip packages *are* compiled versions of FFmpeg.
If the ABI of some other compiled version of FFmpeg is the same as
those, then we should be able to use it directly with JavaCV, if that
was your question.
Samuel
Ah, nice trick. It's a bit hacky and probably not very efficient, but it
has the advantage of working on all devices... thanks!
> The second piece of my question is back to ffmpeg. I understand that
> the provided zip is compiled libraries. The problem I'm running into is
> that I want to be able to build more codecs in (there are many that
> aren't active currently, such as H.264 and VP8) down the road. I may be
> able to use the current libraries for proof of concept, but ultimately
> the ability to build those libraries from source (see:
> http://bambuser.com/opensource) would be a HUGE help if it still worked
> with the interface provided within javacv.
So, simply apply the patch I provide in the package, and recompile from
source. I am not sure where the problem is...?
Samuel
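Enabling extra codecs comes down to re-running FFmpeg's configure with the
codec flags before applying the patch and recompiling. A hedged sketch of
what a cross-compile configure line for ffmpeg 0.7.x on android-arm could
look like; the toolchain prefix, sysroot path, and API level below are
assumptions, not something from this thread, so adjust them for your NDK:

```shell
# Hedged sketch: cross-compiling ffmpeg 0.7.x for android-arm with extra
# codecs enabled. --enable-libx264 (H.264) and --enable-libvpx (VP8) also
# require x264 and libvpx cross-compiled the same way and made visible via
# --extra-cflags / --extra-ldflags.
./configure \
  --enable-cross-compile \
  --target-os=linux \
  --arch=arm \
  --cross-prefix=arm-linux-androideabi- \
  --sysroot="$NDK/platforms/android-9/arch-arm" \
  --enable-gpl \
  --enable-libx264 \
  --enable-libvpx \
  --disable-doc
```

After configure succeeds, apply the patch shipped in the javacv package and
build as usual; the resulting .so files should keep the ABI JavaCV expects.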
Hum, why not start with FFmpegFrameRecorder and work your way from
there? AFAIK, FFmpegFrameRecorder works fine, so try to do the same
thing as that.
Samuel
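Following that advice, the recorder side might look roughly like this with
the javacv API of that era. The package and class names below match the
0.x releases (com.googlecode.javacv), but treat this as a sketch under
those assumptions, not a verified build:

```java
import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.cpp.opencv_core.IplImage;
import static com.googlecode.javacv.cpp.avcodec.CODEC_ID_H264;
import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;

// Sketch: record CIF frames to an H.264/MP4 file with FFmpegFrameRecorder.
public class RecorderSketch {
    public static void main(String[] args) throws Exception {
        int width = 352, height = 288;  // CIF
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("/sdcard/out.mp4", width, height);
        recorder.setVideoCodec(CODEC_ID_H264);  // needs an H.264-enabled build
        recorder.setFrameRate(15);
        recorder.setFormat("mp4");
        recorder.start();

        IplImage image = IplImage.create(width, height, IPL_DEPTH_8U, 3);
        // ... fill image.getByteBuffer() with pixel data, once per frame ...
        recorder.record(image);

        recorder.stop();
    }
}
```

The constructor takes a filename, which is why the streaming question below
is a separate problem: whether an RTMP or UDP URL works in place of a file
path depends on the protocols compiled into the underlying ffmpeg build.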
The preview is done efficiently, yes, but if it's invisible, does it
actually skip any drawing entirely?
>> Hum, why not start with FFmpegFrameRecorder and work your way from
>> there? AFAIK, FFmpegFrameRecorder works fine, so try to do the same
>> thing as that.
> OK... I have been trying this and can't seem to get it right, either.
> The data I am getting is a byte[] of YUV420P, but I can't figure out
> how to load the data into the IplImage to use the FFmpegFrameRecorder
> directly. So, I tried to essentially follow the steps and can't get
> that to work either. I feel like I'm just missing something being
> initialized somewhere in this chain to get the frames. Part of the
> issue with the frame recorder is also that I don't want to write a
> file, I am looking to stream the data. Do you have any suggestions
> for where to look/try?
Ah, I am afraid I have never tried that. I am sure FFmpeg can be
tortured in many ways, but I have not attempted to encode a stream with
it, alas... That's something FFmpeg experts could help you with, though.
Samuel
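For reference, the YUV420P byte[] described above is three contiguous
planes, so getting it into a buffer with the same layout is mostly offset
arithmetic. A minimal pure-Java sketch of that layout; the class and
helper names are made up for illustration:

```java
// Plane layout of a YUV420P (I420) frame: a full-resolution Y plane
// followed by quarter-resolution U and V planes, 12 bits per pixel total.
public class Yuv420pLayout {
    public static int ySize(int width, int height) {
        return width * height;
    }

    public static int chromaSize(int width, int height) {
        return (width / 2) * (height / 2);  // each chroma plane
    }

    public static int frameSize(int width, int height) {
        return ySize(width, height) + 2 * chromaSize(width, height);
    }

    // Offsets of each plane inside the byte[]: Y at 0, U after Y, V after U.
    public static int uOffset(int width, int height) {
        return ySize(width, height);
    }

    public static int vOffset(int width, int height) {
        return ySize(width, height) + chromaSize(width, height);
    }

    public static void main(String[] args) {
        int w = 352, h = 288;  // CIF
        System.out.println(frameSize(w, h));  // total bytes per frame
        System.out.println(uOffset(w, h));
        System.out.println(vOffset(w, h));
    }
}
```

Because the planes are contiguous, a whole frame in this layout can be
copied into any destination buffer of the same size in one bulk put,
rather than pixel by pixel.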