How to use libwebm?


Timothy Madden

Apr 16, 2014, 6:15:46 PM
to webm-d...@webmproject.org
Hello

I am trying to output a .webm video (a 10-minute screen capture) from my C++ application.

I have not yet found any documentation on how to create and use a "muxer" that receives the encoded data (the vpx_fixed_buf) from vpx_codec_encode() and saves it in a proper .webm file. Should I just read the libwebm source and look for the functions that open/write/close .webm files in one of the header files? Or the vpxenc source?

Thank you,
Timothy Madden

Brendan Bolles

Apr 16, 2014, 6:40:49 PM
to webm-d...@webmproject.org
On Apr 16, 2014, at 3:15 PM, Timothy Madden wrote:

> I have not yet found any documentation on how to create and use a "muxer" that receives the encoded data (the vpx_fixed_buf) from vpx_codec_encode() and saves it in a proper .webm file. Should I just read the libwebm source and look for the functions that open/write/close .webm files in one of the header files? Or the vpxenc source?


I use libwebm and libvpx in my Premiere plug-in that you can see here:

http://github.com/fnordware/AdobeWebM


Also check out libwebm/mkvmuxer.cpp for an example.


Basically...

MyMkvWriter writer; // your own IMkvWriter sub-class (IMkvWriter itself is abstract)
mkvmuxer::Segment segment;

segment.Init(&writer);

uint64_t vid_track = segment.AddVideoTrack(width, height, 1); // returns the track number

// do many times
segment.AddFrame(data, size, vid_track, timeStamp, kf);

segment.Finalize();
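For reference, libwebm also ships a ready-made FILE-based writer, mkvmuxer::MkvWriter (in mkvwriter.h/.hpp depending on the version), so you often don't need your own sub-class at all. To show what a custom sub-class has to implement, here is a self-contained sketch. Note the hedging: IMkvWriterLike is a hand-copied stand-in for mkvmuxer::IMkvWriter (in real code, include the libwebm header instead of redeclaring it), and FileWriter is a hypothetical minimal implementation, not libwebm code.

```cpp
#include <cstdint>
#include <cstdio>

// Local stand-in mirroring mkvmuxer::IMkvWriter's five virtuals
// (copied by hand; use the real libwebm header in production code).
class IMkvWriterLike {
 public:
  virtual ~IMkvWriterLike() {}
  virtual std::int32_t Write(const void* buf, std::uint32_t len) = 0;  // 0 == success
  virtual std::int64_t Position() const = 0;            // tell
  virtual std::int32_t Position(std::int64_t pos) = 0;  // seek, 0 == success
  virtual bool Seekable() const = 0;
  // Called when the muxer starts a top-level element; a plain file
  // writer can ignore it.
  virtual void ElementStartNotify(std::uint64_t element_id,
                                  std::int64_t position) = 0;
};

// Minimal FILE*-backed writer, the kind of sub-class the sketch above assumes.
class FileWriter : public IMkvWriterLike {
 public:
  explicit FileWriter(const char* path) : file_(std::fopen(path, "wb")) {}
  ~FileWriter() override { if (file_) std::fclose(file_); }
  bool ok() const { return file_ != nullptr; }

  std::int32_t Write(const void* buf, std::uint32_t len) override {
    if (!file_) return -1;
    return std::fwrite(buf, 1, len, file_) == len ? 0 : -1;
  }
  std::int64_t Position() const override { return std::ftell(file_); }
  std::int32_t Position(std::int64_t pos) override {
    return std::fseek(file_, static_cast<long>(pos), SEEK_SET) == 0 ? 0 : -1;
  }
  bool Seekable() const override { return true; }
  void ElementStartNotify(std::uint64_t, std::int64_t) override {}

 private:
  std::FILE* file_;
};
```

Seeking must work (Seekable() returning true) if you want Segment to go back and patch in the cues and duration when Finalize() is called.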



Brendan

Frank Galligan

Apr 17, 2014, 12:40:15 PM
to WebM Discussion
Vignesh recently added libwebm to vpxdec [1], which you can use as another example.





austin...@gmail.com

Jan 15, 2018, 7:26:37 AM
to webm-d...@webmproject.org

Hi Brendan,
Can you please help me understand the timeStamp argument of the AddFrame API?
I have the timestamp from the RTP packet. Should I pass that timestamp here, or do I need to convert it first?
If I have to convert it, then how?

Thanks
Austin

Artiom Khachaturian

Sep 8, 2023, 7:43:34 PM
to WebM Discussion, austin...@gmail.com
1. At a minimum, you need to assemble the RTP packets into complete frames (for the VPx codecs; this is not relevant for Opus, for example): collect the RTP packets into a temporary buffer, check the marker flag on each packet, and when the flag is set, combine the buffered packets into one frame. The timestamp of the last packet should be taken as the timestamp of the entire frame.
2. MKV timestamps are in nanoseconds and must increase monotonically. For the 1st frame the timestamp is 0 (zero); for each subsequent frame it is the accumulated difference of RTP timestamps, converted to nanoseconds:

template <typename T>
inline constexpr uint64_t ValueToNano(T value) {
    return value * 1000ULL * 1000ULL * 1000ULL;
}

uint64_t TrackInfo::UpdateTimeStamp(uint32_t lastRtpTimestamp)
{
    // _granule & _lastRtpTimestamp are zero initially
    const auto current = ValueToNano(_granule) / GetClockRate();
    if (lastRtpTimestamp > _lastRtpTimestamp) {
        if (_lastRtpTimestamp) {
            _granule += lastRtpTimestamp - _lastRtpTimestamp;
        }
        _lastRtpTimestamp = lastRtpTimestamp;
    }
    return current;
}


GetClockRate() returns the sample/clock rate of your media signal (90000 for most RTP video payloads, for example).
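To make the arithmetic concrete, here is a self-contained version of the same logic with the clock rate fixed at 90 kHz (an assumption for illustration; substitute your payload's rate). The member names _granule and _lastRtpTimestamp match the snippet above. One subtlety worth noticing: `current` is computed before the new delta is applied, so the value returned by a given call reflects the ticks accumulated up to the previous call.

```cpp
#include <cstdint>

constexpr std::uint64_t kClockRate = 90000;  // assumed 90 kHz RTP video clock

template <typename T>
inline constexpr std::uint64_t ValueToNano(T value) {
    return value * 1000ULL * 1000ULL * 1000ULL;
}

struct TrackInfo {
    std::uint64_t _granule = 0;           // accumulated RTP ticks
    std::uint32_t _lastRtpTimestamp = 0;  // last RTP timestamp seen

    // Returns the MKV timestamp (ns) derived from the ticks accumulated so
    // far, then records the new RTP timestamp so the next call adds the delta.
    std::uint64_t UpdateTimeStamp(std::uint32_t lastRtpTimestamp) {
        const auto current = ValueToNano(_granule) / kClockRate;
        if (lastRtpTimestamp > _lastRtpTimestamp) {
            if (_lastRtpTimestamp) {
                _granule += lastRtpTimestamp - _lastRtpTimestamp;
            }
            _lastRtpTimestamp = lastRtpTimestamp;
        }
        return current;
    }
};
```

At 30 fps the RTP timestamp advances by 3000 ticks per frame, so calling with 1000, 4000, 7000, 10000 returns 0, 0, 33333333, 66666666 ns: because of the compute-before-update order the result lags one call behind, which may or may not be what you want depending on where in your packet loop you call it.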
