As you know, I've been trying to sort out the interlaced handling, and I've got a puzzling situation.
I have a number of 1080i PAFF samples and they seem to exhibit very strange pipeline behaviour.
Even once the decoder has started delivering frames, and I'm keeping the in:out ratio at 1:1, I reach a point where it refuses to return the second field. The point at which this happens seems to depend on how many frames I fed into the decoder before I started reading them back: if I feed 20 frames before I try to read an output frame, then I can read 20 frames before it blocks and says it has nothing ready; if I feed 128 frames, it will give me 128, and so on.
So, it seems like the decoder is actually not processing frames once I start extracting them. Does that make any sense?
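For reference, the pattern I'm testing with looks roughly like this (decoder_send(), decoder_receive() and next_input() are just placeholders here, not the real API):

    /* pre-feed N coded pictures before reading anything back */
    for (int i = 0; i < N; i++)
        decoder_send(dec, next_input());

    /* then hold the in:out ratio at 1:1 */
    while (have_input()) {
        decoder_send(dec, next_input());
        frame_t *f = decoder_receive(dec);  /* after N outputs this starts returning nothing */
        if (f)
            handle_output(f);
    }
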
If I don't enforce that both fields must be returned for a single input, then each field is returned against a separate input and nothing blocks, but the pipeline grows by one for every output frame and eventually fills up.
I don't see this problem with 1080p content, even at very high bitrates.
--phil
Ok; I realise what's going on. PAFF fields are separate in the input stream, unlike MPEG-2 or MBAFF, so I'm actually draining too fast.
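In other words, with PAFF a displayed frame consumes two inputs, so the drain side shouldn't expect one output per input. Something along these lines, using the same placeholder names as before (stream_is_paff() is made up too):

    /* each PAFF field is its own coded picture, so one output frame
       corresponds to two inputs */
    int inputs_per_frame = stream_is_paff(s) ? 2 : 1;
    int pending = 0;

    while (have_input()) {
        decoder_send(dec, next_input());
        if (++pending < inputs_per_frame)
            continue;                       /* don't try to drain until a whole frame went in */
        frame_t *f = decoder_receive(dec);
        if (f) {
            handle_output(f);
            pending = 0;
        }
    }
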
--phil