encoder stops returning packets after first frame


yos...@hyper-room.io

Oct 17, 2016, 1:58:50 PM
to Codec Developers
Would appreciate any help figuring this out. I'm currently writing a wrapper for .NET: I take in RGB24 data, convert it to I420, and hand it off to a VP8 encoder.

Every call to vpx_codec_encode returns VPX_CODEC_OK, and the first call to vpx_codec_get_cx_data returns a packet. But on every subsequent encode, the follow-up call to vpx_codec_get_cx_data returns null.
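For reference, the retrieval pattern in simple_encoder.c that I'm trying to mirror looks roughly like this (paraphrased from memory, not my exact code):

    vpx_codec_iter_t iter = NULL;
    const vpx_codec_cx_pkt_t* pkt;
    while ((pkt = vpx_codec_get_cx_data(codec, &iter)) != NULL) {
        if (pkt->kind == VPX_CODEC_CX_FRAME_PKT) {
            // write pkt->data.frame.buf (pkt->data.frame.sz bytes) to the output
        }
    }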

Below are my C++ wrappers for encoding and packet retrieval. I reviewed simple_encoder.c and can't figure out what's different. Does anyone notice anything egregiously erroneous here?

    bool Encode(vpx_codec_ctx_t* codec, vpx_codec_iter_t* iter, vpx_image_t* image, int timeStamp)
    {
        *iter = NULL;  // reset the caller's iterator; assigning to the parameter itself only changes a local copy
        const vpx_codec_err_t res = vpx_codec_encode(codec, image, timeStamp, 1, 0, VPX_DL_BEST_QUALITY);
        return res == VPX_CODEC_OK;
    }

    bool GetEncodedPacket(vpx_codec_ctx_t* codec, vpx_codec_iter_t* iter, char* data, int* size)
    {
        const vpx_codec_cx_pkt_t* pkt = vpx_codec_get_cx_data(codec, iter);
        if (pkt != NULL && pkt->kind == VPX_CODEC_CX_FRAME_PKT)
        {
            *size = (int)pkt->data.frame.sz;
            memcpy(data, pkt->data.frame.buf, pkt->data.frame.sz);
            return true;
        }
        return false;
    }
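For completeness, the caller drives these two wrappers roughly like this (simplified; the buffer size and frameIndex here are placeholders):

    vpx_codec_iter_t iter = NULL;
    if (Encode(&codec, &iter, image, frameIndex)) {
        char buffer[512 * 1024];  // placeholder size, assumed large enough for one compressed frame
        int size = 0;
        while (GetEncodedPacket(&codec, &iter, buffer, &size)) {
            // hand buffer/size over to the .NET side
        }
    }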

This is the rest of my code:

#define encInterface (vpx_codec_vp8_cx())
 
    int Init(vpx_codec_ctx_t* codec, vpx_codec_enc_cfg_t* cfg, int width, int height, int targetBitRate) {
        const vpx_codec_err_t res = vpx_codec_enc_config_default(encInterface, cfg, 0);
        if (res != VPX_CODEC_OK)
            return res;
        if (targetBitRate != 0)
            cfg->rc_target_bitrate = targetBitRate;
        cfg->g_w = width;
        cfg->g_h = height;
        cfg->g_timebase.num = 1;
        cfg->g_timebase.den = 24;  // fixed at 24 fps
        cfg->g_error_resilient = VPX_ERROR_RESILIENT_DEFAULT;
        return vpx_codec_enc_init(codec, encInterface, cfg, 0);
    }

    vpx_image_t* CreateImage(int width, int height) {
        vpx_image_t* image = new vpx_image_t();
        // align=1 keeps the planes packed (stride == width)
        if (vpx_img_alloc(image, VPX_IMG_FMT_I420, width, height, 1) == NULL) {
            delete image;
            return NULL;
        }
        return image;
    }

    void WriteToImage(vpx_image_t* image, unsigned char* data) {
        // Assumes a packed I420 buffer: full-size Y plane followed by quarter-size U and V.
        int toRead = image->d_w * image->d_h * 3 / 2;
        memcpy(image->img_data, data, toRead);
    }


The code was compiled with Visual Studio 2015 Community Edition. The project was generated with: configure --as=yasm --target=x86_64-win64-vs14 --enable-static-msvcrt

Tom Finegan

Oct 19, 2016, 3:16:31 PM
to codec...@webmproject.org
Nothing is jumping out at me in the code you've provided. It sounds like you might be getting caught by the default value for g_lag_in_frames[1]. Try setting that to 0 to disable lagged encoding.
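With lagged encoding enabled the encoder buffers frames before it starts producing packets, which matches the symptom you're describing. Disabling it is a one-line change in your Init; if you ever keep the lag instead, you need an end-of-stream flush the way simple_encoder.c does it, by encoding NULL frames until no packets come back. A rough sketch against your wrappers above:

    cfg->g_lag_in_frames = 0;  // one output packet per vpx_codec_encode call

    // End-of-stream flush, only needed if lagged encoding stays on:
    int got_pkts;
    do {
        got_pkts = 0;
        vpx_codec_encode(codec, NULL, -1, 1, 0, VPX_DL_BEST_QUALITY);
        vpx_codec_iter_t iter = NULL;
        const vpx_codec_cx_pkt_t* pkt;
        while ((pkt = vpx_codec_get_cx_data(codec, &iter)) != NULL)
            got_pkts = 1;  // consume/write each packet here
    } while (got_pkts);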


