
IAsyncReader implementation


Robin

Apr 1, 2005, 10:49:28 AM
Hello!

I used CSource and CSourceStream for my DirectShow source filter to implement a transport mechanism based on the push mode. This works fine, because handling the FillBuffer() method is easy.

Now I would also like to support the pull-mode IAsyncReader, because most MPEG-2 demultiplexers need it.

But it does not seem to be simple to do that.
So I also inherit from IAsyncReader in my output pin class.
Now I have to implement these methods of the interface:

HRESULT IAsyncReader::RequestAllocator(IMemAllocator *, ALLOCATOR_PROPERTIES *, IMemAllocator **)
HRESULT IAsyncReader::Request(IMediaSample *, DWORD_PTR)
HRESULT IAsyncReader::WaitForNext(DWORD, IMediaSample **, DWORD_PTR *)
HRESULT IAsyncReader::SyncReadAligned(IMediaSample *)
HRESULT IAsyncReader::SyncRead(LONGLONG, LONG, BYTE *)
HRESULT IAsyncReader::Length(LONGLONG *, LONGLONG *)
HRESULT IAsyncReader::BeginFlush(void)
HRESULT IAsyncReader::EndFlush(void)

But this alone does not work: in the preparation phase, when connecting my source filter to the pull-mode demuxer (which is done by the graph builder), I get an error: E_NOINTERFACE.

I have debugged it a little, and this error occurs because the hidden function "AttemptConnect()" asks for the wrong interface on the input pin: IMemInputPin (the push model). That's the reason the connection fails. But it should ask for IAsyncReader support (at least as a second try)! So how can I adjust this?
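
(A rough sketch of the kind of change that would be needed, assuming the E_NOINTERFACE really does come from CBaseOutputPin::CheckConnect() querying the input pin for IMemInputPin; CMyOutputPin is just a stand-in for the CSourceStream-derived pin, and this alone does not deal with the allocator or the streaming thread:)

HRESULT CMyOutputPin::CheckConnect(IPin *pPin)
{
    // CBaseOutputPin::CheckConnect() queries the downstream pin for
    // IMemInputPin and fails with E_NOINTERFACE for pull-mode pins.
    // Fall back to the plain CBasePin checks (direction, pointer) instead.
    return CBasePin::CheckConnect(pPin);
}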

Maybe I cannot use CSource & CSourceStream to implement both transport mechanisms (pull & push), but then I would have to start with CBaseFilter & CBasePin, which means a lot more work for both mechanisms.

Thanks for any help.

Regards,
Robin Siegemund.


Iain

Apr 2, 2005, 5:00:03 AM
On Fri, 1 Apr 2005 17:49:28 +0200, Robin wrote:

> Hello!
>
> I used CSource and CSourceStream for my DirectShow source filter
> to implement a transport mechanism based on the push-mode.
> This is working fine, because the handling of the fillbuffer() method is
> easy.
>
> Now I also like to have support for the pull-mode IAsyncReader because
> most mpeg2 demultiplexers need it.
>
> But it seems not to be simple to do that.
> So I also inherit from IAsyncReader in my outputpin class.
> Now I have to implement these methods of this interface:


This is probably a silly question, but have you looked at the AsyncFilter
sample? I've written one or two filters based around this and had no
particular problems doing so.

If nothing else you can compare it to what your code does...


Iain
--
Iain Downs (DirectShow MVP)
Software Product Consultant
www.idcl.co.uk

Robin

Apr 4, 2005, 4:29:14 AM
> > Hello!
> >
> > I used CSource and CSourceStream for my DirectShow source filter
> > to implement a transport mechanism based on the push-mode.
> > This is working fine, because the handling of the fillbuffer() method is
> > easy.
> >
> > Now I also like to have support for the pull-mode IAsyncReader because
> > most mpeg2 demultiplexers need it.
> >
> > But it seems not to be simple to do that.
> > So I also inherit from IAsyncReader in my outputpin class.
> > Now I have to implement these methods of this interface:
>
>
> This is probably a silly question, bu thave you looked at the AsyncFilter
> sample? I've written one or two filters based around this and had no
> particular problems doing so.
>
> If nothing else you can compare it to what your code does...
>
>
> Iain

I don't see your point! The Microsoft example is not using CSource and
CSourceStream.
And this is where my question occurred: can I implement IAsyncReader
based on these classes and NOT on CBaseFilter / CBasePin like in the sample?
That should be the easiest way to get both modes (pull/push) together.

Sorry, but do better reading (and quoting) next time, before calling my
questions silly.

Robin.


Alessandro Angeli [MVP::DigitalMedia]

Apr 4, 2005, 6:41:30 AM
Robin wrote:

> I don't see your point! The microsoft example is not
> using CSource and CSourceStream.

And that is Iain's point.

> And this is where my question occured: Can I implement
> the IAsyncReader based on these classes and NOT on
> CBaseFilter / CBasePin like in the sample. This should be
> the most easiest way to get both modes (pull/push)
> together.

CSource/CSourceStream (from the BaseClasses) are base
classes useful to write push-mode source filters that use
IMemInputPin.

CAsyncReader/CAsyncStream (from the Async sample) are base
classes useful to write pull-mode source filters that
implement IAsyncReader.

If you want to mix the 2 transports, you need to write a
filter from scratch or from more primitive base classes, or
you need to heavily modify one pair of the above base
classes to also support the other transport.
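
For instance, whichever route you take, the output pin of such a hybrid filter at least has to expose IAsyncReader through its QueryInterface, roughly like this (sketch only; CMyHybridPin is a made-up name, and the rest of the pull-mode plumbing still has to be implemented):

STDMETHODIMP CMyHybridPin::NonDelegatingQueryInterface(REFIID riid, void **ppv)
{
    // Expose IAsyncReader in addition to whatever the push-mode base
    // class already exposes (IPin, IQualityControl, ...).
    if (riid == IID_IAsyncReader)
        return GetInterface(static_cast<IAsyncReader *>(this), ppv);
    return CBaseOutputPin::NonDelegatingQueryInterface(riid, ppv);
}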

> Sorry, but do better reading (and quoting) next time,

What about some RTFM instead?

> before call my questions silly.

Iain didn't call *your* question silly but *his* own.


--

// Alessandro Angeli
// MVP :: Digital Media
// a dot angeli at psynet dot net


Robin

Apr 5, 2005, 4:57:17 AM

Ok, thanks and sorry for my mistake on the silly thing...

The problem with the Microsoft example is that it uses a lot
of separate classes mixed together, so it is not easy to get
an overview.

Now I have a first implementation of IAsyncReader based
on the ideas of the Microsoft sample, but I am not using their
classes, because I need no file access... and this is the problem now.

Are there any resources available on how best to use
IAsyncReader (pull mode) with network sources instead
of file sources?

Because the most useful MPEG-2 demultiplexers only
support the pull interface.
But I have no file access... I get a stream from the network.

Regards,
Robin.

Alessandro Angeli [MVP::DigitalMedia]

Apr 5, 2005, 6:28:04 AM
Robin wrote:

> The problem on the microsoft example is, that it is using
> a lot of seperated classes mixed together, so to get an
> overview
> is not so easy.
>
> Now I have a first implementation of IAsyncSource based
> on the ideas of the mircosoft sample but I do not using
> their classes.
> Because I need no file access... and this is the problem
> now.

From the Async sample you only need 4 files: asyncio.cpp,
asyncio.h, asyncrdr.cpp, asyncrdr.h. The only modification
needed IIRC is to comment out #include "asyncflt.h" in
asyncrdr.cpp.

Derive your filter from CAsyncReader and its output pin from
CAsyncStream.

Add whatever initialization interface you like to the filter
(I used IFileSourceFilter, but a custom interface would be
ok, too).

Override the following methods on the pin: SetPointer(),
Read(), Size(), Alignment(), Lock(), Unlock(). The
Lock()/Unlock() pair can just execute a Lock()/Unlock() on a
private member CCritSec. The Alignment() can just return 1
(or whatever packet size your network protocol requires).
SetPointer(), Read(), Size() must be implemented according
to how your protocol works (SetPointer() and Size() may need
to return an error).
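
A minimal sketch of such a pin-side stream class for a network source might look like this (method signatures as in the Async sample's asyncio.h, so double-check them against your copy; CNetBuffer is a made-up wrapper around your own receive/prebuffer code, not part of the SDK):

class CNetStream : public CAsyncStream
{
public:
    CNetStream() : m_llPos(0) {}

    // Seek within the data we can address; a purely sequential network
    // source may have to refuse positions it cannot reach.
    HRESULT SetPointer(LONGLONG llPos)
    {
        if (llPos < 0 || llPos > m_buffer.TotalSize())
            return S_FALSE;
        m_llPos = llPos;
        return S_OK;
    }

    // Copy from the prebuffer; a real implementation would block (or fail)
    // if the requested range has not arrived from the network yet.
    HRESULT Read(PBYTE pbBuffer, DWORD dwBytesToRead, BOOL bAlign, LPDWORD pdwBytesRead)
    {
        *pdwBytesRead = m_buffer.CopyFrom(m_llPos, pbBuffer, dwBytesToRead);
        m_llPos += *pdwBytesRead;
        return S_OK;
    }

    // Total size as the return value, currently available bytes via the argument.
    LONGLONG Size(LONGLONG *pSizeAvailable)
    {
        if (pSizeAvailable)
            *pSizeAvailable = m_buffer.AvailableSize();
        return m_buffer.TotalSize();
    }

    DWORD Alignment() { return 1; }          // or your packet size
    void Lock()       { m_csLock.Lock(); }
    void Unlock()     { m_csLock.Unlock(); }

private:
    CCritSec   m_csLock;   // guards the position and the buffer
    LONGLONG   m_llPos;    // current read position
    CNetBuffer m_buffer;   // hypothetical network prebuffer
};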

> Are their any resource available, how it's best to use
> IAsyncSource (pull mode) with network sources instead
> file sources?
>
> Because the most useful mpeg2 demultiplexers only
> support the pull interface.
> But I have no file access... I get a stream from network.

Pull mode uses random access to the source stream, where the whole
stream is available at any moment, while a network source is usually a
sequential stream where only the current samples are available, so it
is not easy to write a pull-mode source filter for a network source,
hence your problems. You need to do whatever buffering is needed to
simulate random access: this is what the URLReader does, to the extent
that it fully transfers the stream and buffers it to disk when a
request is made to read a part of the stream that it does not yet have.

Robin

Apr 5, 2005, 8:14:00 AM
Hello Alessandro!

Thanks for your detailed comments.

Actually, I did not implement the IAsyncReader interface the way you
described it.
I only use two classes, with these methods (only the method declarations are listed):

class CWrapperStream :
    public IAsyncReader,
    public CBasePin
{
    CWrapperStream(HRESULT *phr, CDShowWrapper *pParent, LPCWSTR pPinName);

    STDMETHODIMP NonDelegatingQueryInterface(REFIID, void **);

    STDMETHODIMP Connect(IPin *pReceivePin, const AM_MEDIA_TYPE *pmt);

    HRESULT InitAllocator(IMemAllocator **ppAlloc);

    STDMETHODIMP RequestAllocator(IMemAllocator *, ALLOCATOR_PROPERTIES *, IMemAllocator **);
    STDMETHODIMP Request(IMediaSample *, DWORD_PTR);
    STDMETHODIMP WaitForNext(DWORD, IMediaSample **, DWORD_PTR *);
    STDMETHODIMP SyncReadAligned(IMediaSample *);
    STDMETHODIMP SyncRead(LONGLONG, LONG, BYTE *);
    STDMETHODIMP Length(LONGLONG *, LONGLONG *);
    STDMETHODIMP BeginFlush(void);
    STDMETHODIMP EndFlush(void);

    HRESULT GetMediaType(int iPosition, CMediaType *pmt);
    HRESULT CheckMediaType(const CMediaType *pMediaType);
};

class CDShowWrapper :
    public CBaseFilter,
    public IDShowWrapper
{
    CDShowWrapper(LPUNKNOWN lpunk, HRESULT *phr);
    ~CDShowWrapper();
    static CUnknown *WINAPI CreateInstance(LPUNKNOWN punk, HRESULT *phr);

    int GetPinCount() { return 1; }
    CBasePin *GetPin(int n);
};

From the IAsyncReader interface I have only implemented the methods
Length(), SyncRead() and RequestAllocator(), because the other methods
are not called by any demultiplexer so far (I tested this with debug
breakpoints).

As you wrote, I stream over the network and have no file access.
So, in my application (which uses this filter) I set up a simple
buffering mechanism for prebuffering data, so the demuxer
can configure itself with some data before playing.

But there are 2 big problems so far:
1.
Some (pull-mode, of course) demultiplexers don't connect, because the
AttemptConnect() method in the background produces this error:
"Cannot modify or delete an object that was added using the COM+ Admin SDK".
I don't understand what's going wrong here.

2.
When the demultiplexer connects (where the first problem does not occur),
it works with file access... BUT only with correct initialization
in the Length() method of the IAsyncReader interface.
So I have to limit the access to the size of my prebuffer, something like
this:

STDMETHODIMP CWrapperStream::Length(LONGLONG *pTotal, LONGLONG *pAvailable)
{
    *pTotal = buffersize;
    *pAvailable = buffersize;
    return S_OK;
}

But then it will only play the stream up to the specified buffersize (some
frames).
If I set *pTotal to a much bigger value, I get something like an
illegal-access problem.

The only possibly working trick I can see here is to set *pTotal to a very
big value later, after the demuxer's automatic precaching/configuring.

But I have no access to this reported pTotal length later, do I? ;-(

Regards,
Robin Siegemund


> From the Async sample you only need 4 files: asyncio.cpp,
> asyncio.h, asyncrdr.cpp, asyncrdr.h. The only modification
> needed IIRC is to comment out #include "asyncflt.h" in
> asyncrdr.cpp.
>
> Derive your filter from CAsyncReader and its output pin from
> CAsyncStream.
>
> Add whatever initialization interface you like to the filter
> (I used IFileSourceFilter, but a custom interface would be
> ok, too).
>
> Override the following methods on the pin: SetPointer(),
> Read(), Size(), Alignment(), Lock(), Unlock(). The
> Lock()/Unlock() pair can just execute a Lock()/Unlock() on a
> private member CCritSec. The Alignment() can just return 1
> (or whatever packet size your network protocol requires).
> SetPointer(), Read(), Size() must be implemented according
> to how your protocol works (SetPointer() and Size() may need
> to return an error).
>

Alessandro Angeli [MVP::DigitalMedia]

Apr 5, 2005, 9:35:27 AM
Robin wrote:

> 1.
> Some (pull mode, of course) demultiplexers don't connect
> because the AttemptConnect() Method in the background
> producing this error: "Cannot modify or delete an object
> that was added using the COM+ Admin SDK". I don't
> understand what's going wrong here.

The HRESULT would be much more informative than the message.

> 2.
> When the demultiplexer connects (where the first problem
> does not occur), it is working with file-access... BUT
> only with correct initialisations
> in the Length() Method of the IASyncInterface.
> So I have to limit the access to the size of my
> prebuffer, something like this:
>
> STDMETHODIMP CWrapperStream::Length(LONGLONG
> *pTotal,LONGLONG *pAvailable) {
> *pTotal=buffersize;
> *pAvailable=buffersize;
> return S_OK;
> }
>
> But then, it will only play the stream to the specified
> buffersize (some frames).
> If I specify *pTotal to a much bigger value, I have an
> illegal access like problem.

Well, this looks like a bug in your implementation. Did you
check (with a debugger, maybe) where in the call stack the
exception occurs?

> The only may working trick I can see here is to set the
> size of pTotal later to a very big size, after the
> automatic demuxer precaching/configuring.

I don't think you'll find many splitters (or players) that
will notice when the total length has changed. What splitter
did you try?

> But I have no access to this specified pTotal length,
> isn't it? ;-(

That depends on your protocol.

Did you try setting both the total and available sizes to the
actual length (or some large value), as if the stream were
local, and then, when asked to read a segment you haven't
received yet, blocking until you receive it?
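
A sketch of that approach in the Length()/SyncRead() pair (m_llTotalSize, m_buffer, WaitForRange() and CopyFrom() are made-up members and helpers around your own prebuffer, not SDK calls):

STDMETHODIMP CWrapperStream::Length(LONGLONG *pTotal, LONGLONG *pAvailable)
{
    CheckPointer(pTotal, E_POINTER);
    CheckPointer(pAvailable, E_POINTER);
    *pTotal     = m_llTotalSize;   // real or generously estimated total length
    *pAvailable = m_llTotalSize;   // report everything as available, as if local
    return S_OK;
}

STDMETHODIMP CWrapperStream::SyncRead(LONGLONG llPosition, LONG lLength, BYTE *pBuffer)
{
    // Block until the requested range has arrived from the network,
    // then copy it out of the prebuffer.
    if (!m_buffer.WaitForRange(llPosition, lLength))         // hypothetical helper
        return HRESULT_FROM_WIN32(ERROR_HANDLE_EOF);         // e.g. stream ended early
    m_buffer.CopyFrom(llPosition, pBuffer, lLength);         // hypothetical helper
    return S_OK;
}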

Robin

Apr 7, 2005, 4:26:32 AM
Hi Alessandro!
Thanks for answer.

The problem with the size of the stream seems to be solved.
I simulated (for test purposes) limited access to the first 100 KB
of a file and it's working. So it should work with a separate buffer later,
too.

And hey, the error message "Cannot modify or delete an object
that was added using the COM+ Admin SDK"
IS the HRESULT message.

Greetings,
Robin Siegemund

Alessandro Angeli [MVP::DigitalMedia]

Apr 7, 2005, 4:44:50 AM
Robin wrote:

> And hey, the errormessage "Cannot modify or delete an
> object
> that was added using the COM+ Admin SDK"
> IS the HRESULT message.

I don't doubt that's the message associated with the HRESULT,
but that's not the HRESULT itself, since an HRESULT is a 32-bit
integer which conveys more information than the message and
cannot really be inferred from the message.
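
For example, you can log the raw value next to its message with something like this (sketch; AMGetErrorText() is the DirectShow helper that turns an HRESULT into text):

// Log the raw 32-bit HRESULT together with its message so the
// numeric value is not lost (needs the DirectShow and tchar headers).
void LogHr(HRESULT hr)
{
    TCHAR msg[MAX_ERROR_TEXT_LEN] = TEXT("");
    AMGetErrorText(hr, msg, MAX_ERROR_TEXT_LEN);
    _tprintf(TEXT("hr = 0x%08lX (%s)\n"), (unsigned long)hr, msg);
}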

Peter Sun

Jul 25, 2006, 5:08:35 AM
Sorry for inserting another question here.

I'm implementing a source filter based on CAsyncReader/CAsyncStream, a
pull-mode MPEG2-PS source filter meant to work with the default MPEG-2
Demultiplexer.

The problem is that both my filter and the AsyncFilter sample, which is
built on CAsyncReader/CAsyncStream, have problems connecting to the
default MPEG-2 Demultiplexer. The default File Source (Async.) doesn't
have this problem.

Does anyone know how to make the AsyncFilter sample work properly with
the default MPEG-2 Demultiplexer?

Thanks for any kind of help.

Regards,
Peter

Alessandro Angeli [MVP::DS/MF]

Jul 25, 2006, 7:33:03 AM
Peter Sun wrote:

> Sorry for inserting another question here.

Please don't post the same question in different threads.
It's annoying and makes tracking your problem harder.

--
// Alessandro Angeli
// MVP :: DirectShow / MediaFoundation

Gajendran

May 30, 2008, 11:27:20 AM
Hello

I need help developing a PUSH source filter. I am getting data from a USB
device and storing it in a BYTE buffer, but I don't know how to hand this data
to an output pin. Please guide me; I don't want to parse the received data, I
just want to deliver the data as-is from the device to the output pin.

Please advise.

url:http://www.ureader.com/msg/1471149.aspx

Alessandro Angeli

May 30, 2008, 11:38:08 AM
From: "Gajendran"

> I need help to develop a PUSH source filter. I am getting
> data from a USB device and store it in a BYTE buffer but
> I dont know how to give this data to a output pin. Please
> guide me, I dont want to parse the data received, I just
> want to deliver the data as it is from the device to the
> Output pin.

1. A push source filter does not implement IAsyncReader on
its output pins but uses the downstream IMemInputPin (see the
delivery sketch after point 2).

2. How should we know how your internal code works? How is
your source filter/pin implemented?
CBaseFilter/CBaseOutputPin, CSource/CSourceStream, custom
code... Is it C++ or C# or...?
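
To illustrate point 1: with a CBaseOutputPin-derived pin, each chunk of the byte buffer is pushed downstream roughly like this (DeliverChunk() and its arguments are made up for the sketch; Deliver() hands the sample to the connected pin's IMemInputPin::Receive()):

HRESULT CMyOutputPin::DeliverChunk(const BYTE *pbData, long cbData)
{
    IMediaSample *pSample = NULL;
    HRESULT hr = GetDeliveryBuffer(&pSample, NULL, NULL, 0);   // get a sample from the allocator
    if (FAILED(hr))
        return hr;

    BYTE *pbDest = NULL;
    pSample->GetPointer(&pbDest);
    long cbCopy = (cbData < pSample->GetSize()) ? cbData : pSample->GetSize();
    CopyMemory(pbDest, pbData, cbCopy);                        // copy the device data as-is
    pSample->SetActualDataLength(cbCopy);

    hr = Deliver(pSample);                                     // IMemInputPin::Receive() downstream
    pSample->Release();
    return hr;
}

With CSource/CSourceStream the same copy happens inside FillBuffer(), and the base class does the GetDeliveryBuffer()/Deliver() calls for you.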

--
// Alessandro Angeli
// MVP :: DirectShow / MediaFoundation

// mvpnews at riseoftheants dot com
// http://www.riseoftheants.com/mmx/faq.htm


Gajendran

May 31, 2008, 2:10:56 AM
Hello,

I am new to DirectShow and tried the push source sample filter, but I am very
confused. I understand the code, but I don't know what the methods
GetDeliveryBuffer, GetBuffer, FillBuffer, GetMediaType, SetMediaType,
Deliver and DecideBufferSize do, nor how and when they are called.

In the DirectShow push source sample they convert pixel data into VIDEOINFO
frames, but in my case I am getting Transport Stream data and I just want to
deliver that TS data as-is at the output pin.

I am implementing in VC++. I just started from the DirectShow push source
sample code.

url:http://www.ureader.com/msg/1471149.aspx

xuanvu...@gmail.com

Jul 3, 2015, 1:51:35 AM
Hello, I'm using the DirectShowNet library (DirectShow in .NET) to connect to my DVR device. I can show video in my program and capture it to bitmap images. The problem is that loading the URL of my RTSP stream is very slow, at the call
IFileSourceFilter.Load(url, ammedia)

Here's my code.


///////////////////

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading;
using System.Windows.Forms;
using System.Windows.Media;
using DirectShowLib;

namespace TestCam
{
#region Video guids
[ComVisible(false)]
internal class MediaTypes
{
public static readonly Guid Video = new Guid(0x73646976, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid Interleaved = new Guid(0x73766169, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid Audio = new Guid(0x73647561, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid Text = new Guid(0x73747874, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid Stream = new Guid(0xE436EB83, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
}

// ReSharper disable InconsistentNaming
[ComVisible(false)]
internal class MediaSubTypes
{
public static readonly Guid UYVY = new Guid(0x55595659, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid YUYV = new Guid(0x56595559, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid IYUV = new Guid(0x56555949, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid DVSD = new Guid(0x44535644, 0x0000, 0x0010, 0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);
public static readonly Guid RGB1 = new Guid(0xE436EB78, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB4 = new Guid(0xE436EB79, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB8 = new Guid(0xE436EB7A, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB565 = new Guid(0xE436EB7B, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB555 = new Guid(0xE436EB7C, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB24 = new Guid(0xE436Eb7D, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid RGB32 = new Guid(0xE436EB7E, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid Avi = new Guid(0xE436EB88, 0x524F, 0x11CE, 0x9F, 0x53, 0x00, 0x20, 0xAF, 0x0B, 0xA7, 0x70);
public static readonly Guid Asf = new Guid(0x3DB80F90, 0x9412, 0x11D1, 0xAD, 0xED, 0x00, 0x00, 0xF8, 0x75, 0x4B, 0x99);
}
// ReSharper restore InconsistentNaming
#endregion
//public delegate void MyLoveEvent( camProcess e);

public class camProcess
{
string _url;
//public string url { get { return _url; } set { _url = value; } }
/// <summary>
/// The RTSP filter has been created
/// </summary>
///
//public MyLoveEvent eventLoveKiss;
public void Onloopkiss()
{

owner.Invoke(new MethodInvoker(delegate
{
//Create Sample Grabber Callback
_capGrabber = new SampleGrabberCallback();
_graph = (IGraphBuilder)new FilterGraph();
_video = (IVideoWindow)_graph;

BuildGraph(_graph, _url);


rot = new DsROTEntry(_graph);
IntPtr ownerhandle = IntPtr.Zero;
ownerhandle = owner.Handle;
_video.put_Owner(ownerhandle);
this._video.put_WindowStyle(WindowStyle.Child | WindowStyle.ClipChildren | WindowStyle.ClipSiblings);

if (this._video != null)
{
this._video.SetWindowPosition(0, 0, owner.ClientSize.Width, owner.ClientSize.Height);
}

// Make the video window visible, now that it is properly positioned
this._video.put_Visible(OABool.True);
int hr = MediaControl.Run();
}));
}
private void RaisePropertyChanged(string propertyName)
{
var handler = PropertyChanged;

if (handler != null)
{
handler(this, new PropertyChangedEventArgs(propertyName));
}
}
public event PropertyChangedEventHandler PropertyChanged;
/// <summary>
/// The sample grabber has been created
/// </summary>
public bool Sample
{
get { return _sample; }
set { _sample = value; RaisePropertyChanged("Sample"); }
}
private bool _sample;
private Control _ownner;
public Control owner { get { return _ownner; } set { _ownner = value; } }
private Control _capture;
public Control capture { get { return _capture; } set { _capture = value; } }

/// <summary>
/// The video renderer has been created
/// </summary>
public bool Render
{
get { return _render; }
set { _render = value; RaisePropertyChanged("Render"); }
}
private bool _render;
public bool Filter
{
get { return _filter; }
set { _filter = value; RaisePropertyChanged("Filter"); }
}
/// <summary>
/// Was an error detected constructing the graph
/// </summary>
public bool Fault
{
get { return _fault; }
set { _fault = value; RaisePropertyChanged("Fault"); }
}

/// <summary>
/// The time the first frame was received set by sample grabber
/// <seealso cref="SampleGrabberCallback"/>
/// </summary>
public string FirstFrame
{
get { return _firstFrame; }
set { _firstFrame = value; RaisePropertyChanged("FirstFrame"); }
}
private string _firstFrame;
/// <summary>
/// Is the graph running
/// </summary>
///
public RelayCommand PlayCommand { get; set; }
public RelayCommand CaptureImageCommand { get; set; }
public RelayCommand StopCommand { get; set; }
public bool Running
{
get { return _running; }
set
{
_running = value;
RaisePropertyChanged("Running");
//UpdateCommands();
}
}
private void UpdateCommands()
{
PlayCommand.OnCanExecuteChanged();
StopCommand.OnCanExecuteChanged();
CaptureImageCommand.OnCanExecuteChanged();
}
public long BufferSize
{
get { return _bufferSize; }
set { _bufferSize = value; RaisePropertyChanged("BufferSize"); }
}
private long _bufferSize;
private bool _running;
private bool _fault;
private bool _filter;
private SampleGrabberCallback _capGrabber;
public IMediaEvent MediaEvent { get { return (IMediaEvent)_graph; } }
private IGraphBuilder _graph;
private ISampleGrabber _grabber;
private IVideoWindow _video;
public camProcess(string url, Control owner)
{
start(url, owner);
}
DsROTEntry rot = null;
/// <summary>
/// The media control interface for the graph
/// </summary>
public IMediaControl MediaControl { get { return (IMediaControl)_graph; } }
public bool start(string url, Control owner)
{
Application.CurrentCulture = System.Globalization.CultureInfo.InvariantCulture; // to get a dot as decimal separator when entering the frame rate
stop();
this._url = url;
this.owner = owner;
Onloopkiss();
return true;
}
private void BuildGraph(IGraphBuilder pGraph, string srcFile1)
{
//reset our properties
Filter = false;
Sample = false;
Render = false;
Running = false;
FirstFrame = string.Empty;

//graph builder
var pBuilder = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
int hr = pBuilder.SetFiltergraph(pGraph);
//CheckHr(hr, "Can't SetFiltergraph");

//add RTSP Filter
var pRTSPFilter2 = CreateSourceFilter(pGraph, srcFile1);
Filter = true;
var pSource = CreateFilter(pGraph);
//add Colorspace conveter
//*FYJ var pColorSpaceConverter = CreateColorSpace(pGraph);
//*FYJ var pColorSpaceConverter2 = CreateColorSpace(pGraph);
//add SampleGrabber
var pSampleGrabber = CreateSampleGrabber(pGraph);
//add Video Renderer
var pVideoRenderer = CreateVideoRenderer(pGraph);
//connect RTSP Filter and color space converter
//*FYJ hr = pGraph.ConnectDirect(GetPin(pRTSPFilter2, "Out"), GetPin(pColorSpaceConverter, "Input"), null);
//*FYJ //CheckHr(hr, "Can't connect RTSP Filter and Color space converter");
//*FYJ Color = true;

//connect color space converter and sample grabber
//*FYJ hr = pGraph.ConnectDirect(GetPin(pColorSpaceConverter, "XForm Out"), GetPin(pSampleGrabber, "Input"), null);
//*FYJ //CheckHr(hr, "Can't connect RTSP Filter and Color Space Converter and Sample Grabber");

//?? Do we really need a second color space converter??
//*FYJ hr = pGraph.ConnectDirect(GetPin(pSampleGrabber, "Output"), GetPin(pColorSpaceConverter2, "Input"), null);
//*FYJ //CheckHr(hr, "Can't connect RTSP Filter and Color Space Converter and Sample Grabber and Color converter 2");
//*FYJ Sample = true;

//add a renderer
//*FYJ hr = pGraph.ConnectDirect(GetPin(pColorSpaceConverter2, "XForm Out"), GetPin(pVideoRenderer, "VMR Input0"), null);
//*FYJ //CheckHr(hr, "Can't connect RTSP Filter and Color Space Converter and Sample Grabber and Color converter and video render");
//*FYJ Render = true;

pBuilder.RenderStream(null, null, pRTSPFilter2, pSampleGrabber, pVideoRenderer);

_grabber = pSampleGrabber as ISampleGrabber;
//Trace("Graph Complete");
InitializeSampleGrabber();
}
/// <summary>
/// Create the end line sample grabber. This is the one that
/// captures images from the stream
/// </summary>
/// <param name="pGraph"></param>
/// <returns></returns>
private IBaseFilter CreateSampleGrabber(IGraphBuilder pGraph)
{
var clsidSampleGrabber = new Guid("{C1F400A0-3F08-11D3-9F0B-006008039E37}"); //qedit.dll
var pSampleGrabber3 = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(clsidSampleGrabber));
int hr = pGraph.AddFilter(pSampleGrabber3, "SampleGrabber");
//CheckHr(hr, "Can't add SampleGrabber to graph");
var pSampleGrabber3Pmt = new AMMediaType
{
majorType = MediaType.Video,
subType = MediaSubType.RGB32,
formatType = FormatType.VideoInfo,
fixedSizeSamples = true,
formatSize = 88,
sampleSize = 522240,
temporalCompression = false
};
var pSampleGrabber3Format = new VideoInfoHeader
{
SrcRect = new DsRect(),
TargetRect = new DsRect(),
BitRate = 94003294,
AvgTimePerFrame = 333333,
BmiHeader =
new BitmapInfoHeader
{
Size = 40,
Width = 480,
Height = 272,
Planes = 1,
BitCount = 32,
ImageSize = 522240
}
};
pSampleGrabber3Pmt.formatPtr = Marshal.AllocCoTaskMem(Marshal.SizeOf(pSampleGrabber3Format));
Marshal.StructureToPtr(pSampleGrabber3Format, pSampleGrabber3Pmt.formatPtr, false);
hr = ((ISampleGrabber)pSampleGrabber3).SetMediaType(pSampleGrabber3Pmt);
DsUtils.FreeAMMediaType(pSampleGrabber3Pmt);
//CheckHr(hr, "Can't set media type to sample grabber");

var grabber = pSampleGrabber3 as ISampleGrabber;
grabber.SetCallback(_capGrabber, 1);
return pSampleGrabber3;
}
/// <summary>
/// Creates a video render filter to display video in our sample
/// </summary>
/// <param name="pGraph"></param>
/// <returns></returns>
private IBaseFilter CreateVideoRenderer(IGraphBuilder pGraph)
{
var clsidVideoRenderer = new Guid("{B87BEB7B-8D29-423F-AE4D-6582C10175AC}"); //quartz.dll
var pVideoRenderer2 = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(clsidVideoRenderer));
int hr = pGraph.AddFilter(pVideoRenderer2, "Video Renderer");
//CheckHr(hr, "Can't add Video Renderer to graph");
return pVideoRenderer2;
}
private IBaseFilter CreateFilter(IGraphBuilder pGraph)
{
var clsidVideoRenderer = new Guid("55D1139D-5E0D-4123-9AED-575D7B039569"); //quartz.dll
var pVideoRenderer2 = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(clsidVideoRenderer));
int hr = pGraph.AddFilter(pVideoRenderer2, "Filter Ren");
//CheckHr(hr, "Can't add Video Renderer to graph");
return pVideoRenderer2;
}
/// <summary>
/// The required color space converter
/// </summary>
/// <param name="pGraph"></param>
/// <returns></returns>
private IBaseFilter CreateColorSpace(IGraphBuilder pGraph)
{
var pColorSpaceConverter3 = (IBaseFilter)new Colour();
int hr = pGraph.AddFilter(pColorSpaceConverter3, "Color Space Converter");
//CheckHr(hr, "Can't add Color Space Converter to graph");
return pColorSpaceConverter3;
}
/// <summary>
/// Create our RTSP source filter and load the
/// RTSP source url
/// </summary>
/// <param name="pGraph">The graph the filter will live in</param>
/// <param name="url">The URL to load into the filter</param>
/// <returns></returns>
private IBaseFilter CreateSourceFilter(IGraphBuilder pGraph, string url)
{

var clsidRTSPFilter = new Guid("55D1139D-5E0D-4123-9AED-575D7B039569"); //RTSPSource.ax
IBaseFilter pRTSPFilter2 = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(clsidRTSPFilter));
int hr = pGraph.AddFilter(pRTSPFilter2, "RTSP Filter");
//CheckHr(hr, "Can't add RTSP Filter to graph");
//set source filename
var pRTSPFilter2Src = pRTSPFilter2 as IFileSourceFilter;
if (pRTSPFilter2Src != null)
{
AMMediaType media = new AMMediaType { majorType = MediaType.URLStream, subType = MediaSubType.H264 };
hr = pRTSPFilter2Src.Load(url, media);
DsUtils.FreeAMMediaType(media);
}
Filter = true;


return pRTSPFilter2;

}
/// <summary>
/// Helper function to get a pin from a filter
/// </summary>
/// <param name="filter"></param>
/// <param name="pinname"></param>
/// <returns></returns>
private IPin GetPin(IBaseFilter filter, string pinname)
{
IEnumPins epins;
int hr = filter.EnumPins(out epins);
//CheckHr(hr, "Can't enumerate pins");
IntPtr fetched = Marshal.AllocCoTaskMem(4);
var pins = new IPin[1];
while (epins.Next(1, pins, fetched) == 0)
{
PinInfo pinfo;
pins[0].QueryPinInfo(out pinfo);
bool found = (pinfo.name == pinname);
DsUtils.FreePinInfo(pinfo);
if (found)
return pins[0];
}
//CheckHr(-1, "Pin not found");
return null;
}
private void InitializeSampleGrabber()
{
var mediaType = new AMMediaType { majorType = MediaTypes.Video, subType = MediaSubTypes.RGB32 };
_grabber.GetConnectedMediaType(mediaType);
if (mediaType.formatType == FormatType.VideoInfo && mediaType.formatPtr != IntPtr.Zero)
{
var header = (VideoInfoHeader)Marshal.PtrToStructure(mediaType.formatPtr, typeof(VideoInfoHeader));

if (header != null && header.BmiHeader != null)
{
// Get the pixel count
var pcount = (uint)(header.BmiHeader.Width * header.BmiHeader.Height * PixelFormats.Bgr32.BitsPerPixel / 8);
_capGrabber.Initialize(pcount, header.BmiHeader.Width, header.BmiHeader.Height);
}
}
}


public bool stop()
{
if (this._video != null)
{
this._video.put_Visible(OABool.False);
this._video.put_Owner(IntPtr.Zero);
}

// Remove filter graph from the running object table
if (rot != null)
{
rot.Dispose();

rot = null;
}

if (this._video != null) { Marshal.ReleaseComObject(this._video); this._video = null; }
if (this._graph != null) { Marshal.ReleaseComObject(this._graph); this._graph = null; }

return true;
}
private string DecodeHrResult(uint hr)
{
string ret;
switch (hr)
{
case 0x8004020D:
ret = "Video Buffer does not match image size";
break;
case 0x8004022F:
ret = "Invalid RTSPsource url";
break;
default:
ret = string.Empty;
break;
}
return ret;
}
private void RunGraph()
{
try
{
Running = true;
while (Running)
{
Thread.Sleep(500);
EventCode ev;
IntPtr p1, p2;
Application.DoEvents();
while (MediaEvent.GetEvent(out ev, out p1, out p2, 0) == 0)
{
if (ev == EventCode.Complete || ev == EventCode.UserAbort || ev == EventCode.ErrorAbort)
{
if (ev == EventCode.ErrorAbort)
{
//Trace("ERROR: HRESULT={0:X} {1}", p1, msg);
}
MediaControl.Stop();
Running = false;
}
MediaEvent.FreeEventParams(ev, p1, p2);
}
}
_graph.Abort();
//Trace("Graph Closed");
}
catch (Exception ex)
{
//Trace("ERROR: Running Graph");
}
}
}
}
/////////////////////
How can I do it faster?
Thanks for any help.

Nguyễn Xuân Vương