The learning continues. The Verbier Festival is underway & I have been recording a
number of the concerts from Medici TV (
https://www.medici.tv/). These appear to be
available even if you do NOT have a full paying membership, which I don't. You generally
don't need a paying membership, just a free membership, to view their livestreams while
they are live. But quite often, after the livestream ends & they archive the show on the
site, they put it behind their paywall. So you have the odd situation where you have a
recording of a show that you made while it was live, but you can't view the archived
version of the same show if you aren't a paying member. But these Verbier concerts
appear to be available to non-paying members after the fact. I haven't looked at their
entire archive but I believe their archived copies of concerts from past years of the
Verbier Festival are behind their paywall. I am most appreciative of their making this
year's shows available for free.
In any case, I was wondering how I might get ffmpeg to record a complete livestream that
I might join in progress. Typically, they open their livestreams about 15 minutes before
show time. So if a show begins at 6:00 & I happen to join the livestream at 7:00, I
would want to tell ffmpeg to begin its recording at one hour & 15 minutes before the time
I launch ffmpeg. In other words, I wondered if there is a way to tell ffmpeg to rewind
to the beginning of a livestream. It appears that my Golf Channel recordings do that. I
was wondering if these Medici recordings might do the same.
I went hunting online for any advice. I found many search hits, a few of them
interesting, but none of them relevant. I finally searched with some search key I don't
remember & down at the bottom of one of the search pages I found this reference:
https://gist.github.com/s4y/46738a67c4bc842f1f02f09e1eaf23fd. This reference is no more
recent than March 2021, over a year ago now. I am always skeptical of anything I read
online about ffmpeg because it's been around so long & over the years it's changed a lot.
Plus my experience has been that not all the advice I try works. But I was willing to
give this one a shot.
The ffmpeg documentation (at least on Windows) is bundled in the zip file containing the
product. When you unzip the file you download from
ffmpeg.org, you'll get a subdirectory
named doc. It contains a number of html files with file names that at least suggest what
might be in the files. So you're supposed to open the documentation in your web browser.
I'd prefer PDFs or plain text files, but it is what it is. At least you aren't reliant
on their web site being up in order to access their documentation. In the ffmpeg-all
file, they have what I have to guess is the most extensive documentation of ffmpeg. In
order to verify the advice I read at github, I did a string search in that web page for
-live_start_index. I got no hits, so I was beginning to doubt whether I had found good
advice. Every ffmpeg parameter starts with a hyphen. If you scan through that
particular page of documentation, you will see that a lot of parameters are documented
with the leading hyphen. But it took me a while to discover that I was looking at a case
that didn't follow the standard. This sort of inconsistent documentation makes me crazy.
I have complained many times in this forum about how lousy the ffmpeg documentation is.
Here's another reason to complain. Anyway, it took me a while but I finally tried
looking for that parameter without the leading hyphen. Bingo. It's there. Lesson:
don't always prefix your searches for parameters with the hyphen. If the search comes up
empty, try again without the hyphen.
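If you'd rather not squint at the browser's search box, a plain command-line string search works just as well. Something like this, run from inside the unzipped ffmpeg directory on Windows, should do it (the doc\ffmpeg-all.html path is my guess at where your copy landed):
findstr /i "live_start_index" doc\ffmpeg-all.html
Note that I left the hyphen off the search string, per the lesson above.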
So I finally located the documentation for -live_start_index (absent the hyphen) in
section 20.10. This is a surprisingly short section that documents the HLS demuxer. It
says there that -live_start_index specifies the segment index at which to start a live
stream. Is that to start creating one or start receiving one? This brilliant
documentation leaves the answer to that question to the reader's imagination. I believe
this is typical of all Linux documentation. The people writing this stuff are
intentionally trying to discourage newbies from learning whatever it is they're
documenting for the express purpose of building their egos & keeping theirs a closed
club. Here you have yet one more instance of that. In any case, the github advice says
to code -99999. This tells me the author there believed the parameter would back up
that many segments from the current time, i.e. whenever you happen to launch ffmpeg,
skipping back some unreasonably large number of segments to reach the start. But the documentation
says that negative numbers count back from the end of the stream. I pondered all of this
& decided -99999 was not really the correct way to go. The value you really want is
simply 0. That is, 0 rather than 1, because ffmpeg usually numbers things starting at 0.
So I assumed 0 would be correct here. Assumed. Not explicitly
documented. I was guessing. Again. As I usually do when it comes to this execrable
documentation.
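For what it's worth, here is how I now read the values, with the warning that I'm partly guessing from that same documentation: a non-negative number is a segment index counted from the first segment listed in the manifest, & a negative number counts back from the live end. So:
-live_start_index 0 means start at the first segment in the manifest, i.e. the beginning.
-live_start_index -3 means start 3 segments back from the live edge. I believe something like -3 is what you get if you don't code the parameter at all, but don't hold me to that.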
I tried -live_start_index 0 and it works. It causes ffmpeg recordings from Medici
livestreams to rewind to the start. Hooray! I was so happy to have finally found
something that worked. But I have given this some thought. It is quite likely that the
success of this parameter depends rather heavily on the way the server presents the
livestream. On Medici, their livestreams rarely last more than a few hours. Skipping
back to 0 on Medici isn't such a long time. But suppose you're looking at the livestream
of your favorite public radio station, the one that broadcasts the Metropolitan Opera on
Saturday afternoons during their season. When is the beginning of their livestream?
That radio station streams live 24x7, most likely. The beginning of their livestream is
not within reach. What would you code for such a stream to rewind it only an hour or
two? I haven't figured that out. So this parameter is probably not universally
applicable. But you'll probably find plenty of cases in which it will be very useful.
Still, I am putting up the warning that it might not work on every site. I can imagine
the case of a site that might have been livestreaming something for 3 hours but
-live_start_index 0 might take you back only half an hour or an hour. On that 24x7 radio
site, -live_start_index 0 might go back a few hours. It could easily not work at all.
It all depends on the way the site presents the livestream. So keep this in mind.
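If I ever do need to tackle a stream like that, my first experiment would be a negative value, since the documentation says negative numbers count back from the end. If the segments happen to run about 6 seconds each, which is purely an assumption on my part & surely varies from site to site, then something like
-live_start_index -600
ought to reach back roughly an hour. But that only helps if the site's manifest still lists 600 segments. A 24x7 station that keeps only the last few minutes' worth of segments in its manifest won't let you back up any further than that, no matter what number you code.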
I've been a bit vague so far on where exactly to code -live_start_index. This is
intentional because the story is actually a bit complicated. In that section 20.10 of
the ffmpeg documentation, I noticed a couple of other parameters documented just below
-live_start_index. It so happens that when I have recorded livestreams from Medici, it
seems like the livestream ends but ffmpeg sits there reading the manifest over & over
1000 times before it finally decides to terminate its recording. During those 1000
reads, ffmpeg writes no blocks to the output file. But this takes about 40 minutes. All
of this activity is logged by ffmpeg. If you read the documentation in section 20.10 for
the next few parameters after -live_start_index, you'll see 2 of them have default values
of 1000. Plus the names of the parameters seem particularly relevant. I decided this
can't be a coincidence & started experimenting with them.
When I say experimenting, I mean trial & error, lots of trials & an embarrassingly large
number of errors. When I put those 2 parameters into my invocation of ffmpeg, I kept
getting the syntax error, "Option not found." In order to understand what did finally
work, I need to explain the basic structure of the ffmpeg command. I've kind of done
that upthread here but I need to revisit the subject. The ffmpeg command boils down to
this:
ffmpeg -i <input> <output>
<input> can be many things: a URL of a manifest online, a URL of an MP4 or some other
media file online, the file specification for a manifest on your system, or the file
specification of a media file on your system. It all depends on the function you want to
perform.
<output> is usually a file specification of a media file you want ffmpeg to create.
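To make that concrete, all of these are legitimate shapes for the command. The URLs & file names are made up for illustration:
ffmpeg -i https://example.com/show/master.m3u8 recording.mp4
ffmpeg -i https://example.com/show/clip.mp4 clip-copy.mkv
ffmpeg -i C:\captures\show.m3u8 recording.mp4
That last case, a manifest sitting on your own system, is typically the one that wants the -protocol_whitelist parameter I've mentioned upthread, since the local file turns around & references its segments over http.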
So the command has 2 gaps in it. The first gap is between ffmpeg & -i. The second gap
is between <input> & <output>. The second gap is easy to explain. The ffmpeg parameters
you code in the second gap apply to <output>.
The first gap is a little more complicated. There are 2 classes of ffmpeg parameters
that you code in the first gap. The class on the left, the parameters you code first,
are global parameters that apply to the overall execution of the command. The class on
the right, the parameters you code second, are parameters that apply to -i <input>,
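Putting the 2 gaps together, the overall shape is something like this. This is my own shorthand, not an official syntax diagram:
ffmpeg <global parameters> <input parameters> -i <input> <output parameters> <output>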
parameters that apply to the input file. Among my many errors, I found that if you have
an input parameter & you follow it with a global parameter, ffmpeg will give you the
syntax error "Option not found." It was a valid option, but I was coding it in the wrong
position, so ffmpeg couldn't find it. My experiments have led me to believe that the
-protocol_whitelist parameter that I have mentioned multiple times upthread here is an
input parameter. I think it is global in nature but it seems it is defined in the world
of ffmpeg as an input parameter. On the other hand, it seems that the -hwaccel parameter
you can find mentioned upthread is a global parameter, which at least matches my sense of
what it is. Further, it seems -live_start_index is an input parameter. So this is what
I have used to record Medici livestreams rewound to the beginning, without wasting 1000
useless reads of the manifest after the livestream has ended.
ffmpeg -m3u8_hold_counters 25 -max_reload 25 -hwaccel auto -protocol_whitelist file,crypto,data,http,https,tls,tcp -live_start_index 0 -i <input> outputparms <output>
That's all one line. Google may be displaying it wrapped & folded on your screen. It's
a single command line.
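Just so the placeholders don't hide anything, here is the same command with a made-up manifest URL & file name filled in, & with a plain stream copy standing in for outputparms. Everything after -i here is illustration, not anything Medici actually publishes:
ffmpeg -m3u8_hold_counters 25 -max_reload 25 -hwaccel auto -protocol_whitelist file,crypto,data,http,https,tls,tcp -live_start_index 0 -i https://example.com/live/concert/master.m3u8 -c copy concert.mp4
Again, that's all one line.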
I am not entirely sure which of -m3u8_hold_counters or -max_reload limits ffmpeg to
rereading the manifest only 25 times after the livestream ends. But I've got them both
coded & it works.
Among my many errors, I have found that you can't code these parameters on just any sort
of input. If you're just downloading a file, the parameters as I show them above cause
the "Option not found" error. So it seems ffmpeg figures out what type of input you're
dealing with & parses the parameters accordingly. I have taken special care to code
-m3u8_hold_counters, -max_reload, & -live_start_index ONLY when I'm recording a
livestream. Those parameters are relevant only for livestreams anyway. But I get syntax
errors for that exact same coding pattern when it's not a livestream. So be careful with
it.
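For contrast, when I'm just downloading an ordinary file rather than recording a livestream, I drop those 3 parameters entirely & use something closer to this, where once again the URL & file name are invented:
ffmpeg -hwaccel auto -i https://example.com/media/show.mp4 -c copy show.mp4
All 3 of the parameters I dropped are documented in that HLS demuxer section, which I take to be the reason ffmpeg refuses to recognize them when the input isn't an HLS manifest.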
Something I learned from my many errors is that ffmpeg appears to scan the command line
right to left. One thing that makes me think this is that the global parameters & input
file parameters appear to depend on the type of input file specified. It seems logical
that the only way it could treat those parameters differently depending on the type of
input file is to be scanning the command line right to left. A second piece of evidence
is the way it was generating error messages. When I was making errors with the
-m3u8_hold_counters & -max_reload parameters, the error messages I got led me to believe
it was scanning right to left. When I coded -m3u8_hold_counters followed by -max_reload,
it threw the error message on -max_reload, then terminated command processing. When I
coded -max_reload followed by -m3u8_hold_counters, it threw the error message on
-m3u8_hold_counters, then terminated the command processing. It was always the rightmost
parameter that got flagged. It never flagged them both, even though they were both in
error. I don't know whether this right to left processing is documented. It might be.
I haven't looked. It's probably not really important for successfully using the command
to do downloads. I just found it interesting.
When a livestream ends, the serving web site is supposed to delete the manifest. When
that happens, ffmpeg reads the manifest it has been reading since the livestream began &
gets a 404 Not Found error. It looks like it tries again maybe once or twice, then
finalizes the output file & terminates. I have observed this with some of my Medici
downloads. But sometimes the serving web site doesn't remove the manifest from its site.
The majority of my Medici livestreams are like this. Before I found the
-m3u8_hold_counters & -max_reload parameters, ffmpeg would read the manifest 1000 times
without writing any data to the output file. This appeared to take about 40 minutes.
That works out to about 2.5 seconds per read of the manifest. Once the 1000 reads were
completed, ffmpeg finalized the output file & terminated execution. When I added
-m3u8_hold_counters 25 & -max_reload 25 to my ffmpeg invocations, ffmpeg read the
manifest only 25 times before deciding the livestream was over. So the
-m3u8_hold_counters & -max_reload parameters are something of a safeguard against a
livestream that is not terminated properly. I have found that parameter values of 25
work well enough for Medici livestreams. That works out to a delay of a bit over 1
minute after the livestream ends before ffmpeg decides to stop recording. This might not
work well on all serving sites. You'll have to experiment with the values for these
parameters for your case; maybe 50 would work better, maybe 100. My guess is that -m3u8_hold_counters
is the parameter that is actually taking effect to end my Medici livestreams. But I have
not experimented with coding only -m3u8_hold_counters without -max_reload or only
-max_reload without -m3u8_hold_counters to test that hypothesis. I figure I'm covering
all the bases by coding both parameters & I'll just leave it like that until I hit a case
that makes me re-evaluate what I've done.
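If you want to pin down which one is doing the work on your site, the obvious experiment, which I have not run, is to record the same sort of livestream twice with only one of the 2 coded each time:
ffmpeg -m3u8_hold_counters 25 -hwaccel auto -protocol_whitelist file,crypto,data,http,https,tls,tcp -live_start_index 0 -i <input> outputparms <output>
ffmpeg -max_reload 25 -hwaccel auto -protocol_whitelist file,crypto,data,http,https,tls,tcp -live_start_index 0 -i <input> outputparms <output>
Whichever run still stops about a minute after the livestream ends is the parameter that's earning its keep. And if 25 proves too aggressive on some other site, raise both values to 50 or 100 & try again.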
Now look again at that section 20.10 of the ffmpeg documentation. Do you see where they
explain that one parameter is an input parameter but the others are global parameters?
Do you see where they explain that you can't code these parameters on just any old ffmpeg
invocation? No. I had to suffer through countless trials & errors to learn these
things. Every day I look at this documentation my hatred for it grows. Maybe I've saved
you some trouble. I hope you find this latest update helpful.