Hello
I'm starting an application which will need to show VP9 streams on Linux framebuffer. I'm using a Raspberry Pi 3 to do it.
I built libvpx on it, took a video, and converted it to VP9 successfully.
Now I'm trying to understand how I can use vpxdec to decode and show the video.
What I have done so far is use vpxdec to decode raw YUV frames. I managed to show those frames on the framebuffer, but the conversion is a bit slow: I need to convert from YUV to RGB, and all the per-pixel multiplications take some time. The video is 480x640, and I can convert and play about 35 frames per second.
What I am trying to do now is feed the vpxdec output into my program through a pipe: vpxdec writes the decoded frames to stdout, and my program reads them from stdin, converts them to RGB, and shows them on the framebuffer. It works, but it is really slow (I see maybe 3 to 5 frames per second).
I would like to know if there is a better method to get the decoded video onto the framebuffer.
Sorry if this is a dumb question, I'm pretty new to this topic of VP9 and video streaming.
Best Regards!
Christian Schultz