Thursday, May 22, 2008
LNK1103 Error
------------
"linker error uuid.lib(Servpro_i.obj) : fatal error LNK1103: debugging information corrupt; recompile module"
Solution:
-----------------
The cause of the "linker error uuid.lib(Servpro_i.obj) : fatal error LNK1103: debugging information corrupt; recompile module" is that uuid.lib is listed under Link -> Object/Library Modules, and there is also an entry like $(MSSDK)\Lib in the additional library path. When I removed $(MSSDK)\Lib, it compiled and linked.
Tuesday, May 20, 2008
Sample Grabber Problem
----------------------------
My client's callback function executes successfully the first time; after my callback function returns, I get the error.
Solution:
------------
1. It may be due to different calling conventions.
2. My Sample Grabber application is built with _stdcall and my client application with _cdecl. When I changed my client application to _stdcall, it ran without any problem.
The Sample Grabber callback function must use the same calling convention on both sides.
Check: Project -> Settings -> Code Generation -> Calling Convention.
calculation NTP from RTCP
--------------------------------------------
ULONGLONG NTPts = (ULONGLONG)((MSB * 1000000) + (unsigned int)((LSB / (65536 * 65536.0)) * 1000000)); // MSB = NTP seconds, LSB = NTP fraction (units of 1/2^32 s)
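The same conversion can be sketched as a small self-contained helper (the function name is mine):

```cpp
#include <cstdint>

// Convert the two 32-bit halves of an RTCP NTP timestamp to microseconds.
// msb holds whole seconds; lsb holds the fraction in units of 1/2^32 seconds.
uint64_t NtpToMicroseconds(uint32_t msb, uint32_t lsb)
{
    const double kTwoPow32 = 65536.0 * 65536.0; // 2^32
    return static_cast<uint64_t>(msb) * 1000000ULL
         + static_cast<uint64_t>((lsb / kTwoPow32) * 1000000.0);
}
```

Note that widening to 64 bits before multiplying avoids the overflow that the 32-bit expression above can hit for large second counts.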
Thursday, May 08, 2008
RTP video and audio Timestamp Interval between RTP packets
------------------------------------------------------------
For Audio, the timestamp is incremented by the packetization interval times the sampling rate. For example, for audio packets containing 20 ms of audio sampled at 8,000 Hz, the timestamp for each block of audio increases by ((8000 / 1000) * 20) i.e. by 160.
For 30 fps video, timestamps would increase by (sample rate / 30), i.e. by 3,000 for each frame (the sample rate is 90,000 Hz for video). If a frame is transmitted as several RTP packets, these packets all bear the same timestamp.
ts_unit is equal to the above ...
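The two increments above can be sketched as helpers (function names are mine):

```cpp
#include <cstdint>

// RTP timestamp increment per audio packet: sampling rate times packet duration.
// E.g. 8,000 Hz audio in 20 ms packets -> (8000 / 1000) * 20 = 160.
uint32_t AudioTimestampIncrement(uint32_t sampleRateHz, uint32_t packetMs)
{
    return (sampleRateHz / 1000) * packetMs;
}

// RTP timestamp increment per video frame: the 90 kHz media clock divided by
// the frame rate. E.g. 90000 / 30 = 3000 per frame.
uint32_t VideoFrameIncrement(uint32_t clockRateHz, uint32_t framesPerSecond)
{
    return clockRateHz / framesPerSecond;
}
```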
AMR Muxer for RTP
--------------------
1.Input : AMR File
2.Output : AMR RTP packet data (which contains 500 ms of data:
25 frames * 20 ms)
Read 25 frames and build the RTP AMR data as follows:
i) F0, the 25 frame headers, and the AMR data
ii)
I have to test the RTP muxer completely; the RTP muxer has a problem.
The RTP demuxer will do the following:
1. F0 BC BC BC ... 3C
2. The number of BCs plus 1 gives the number of frames in an RTP packet;
the RTP muxer will write 24 BCs and one 3C.
About this BC value:
we can generate it according to the selected mode.
BC refers to 12.2 kbps (FT = 0111):
1 0111 1 00
F FT   Q padding
F indicates that another speech frame follows;
if F is 0, that is the last speech frame.
After the BCs, 3C marks the last speech frame;
its uppermost bit (F) is zero, so it is the last frame;
1 indicates continuation.
In the case of 3C, check it:
0 0111 1 00
The RTP stack is sending the data correctly;
it is working well and I have received the data properly.
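Pulling the F/FT/Q fields out of a ToC byte like BC or 3C can be sketched as follows (struct and function names are mine):

```cpp
#include <cstdint>

// Fields of one octet-aligned AMR ToC entry: F(1) FT(4) Q(1) padding(2).
struct AmrTocEntry {
    bool    f;  // 1 = another speech frame follows, 0 = last frame
    uint8_t ft; // frame type (7 = 12.2 kbps)
    bool    q;  // frame quality indicator
};

AmrTocEntry ParseTocByte(uint8_t b)
{
    AmrTocEntry e;
    e.f  = ((b >> 7) & 0x01) != 0;
    e.ft = (b >> 3) & 0x0F;
    e.q  = ((b >> 2) & 0x01) != 0;
    return e;
}
```

With this, 0xBC parses as F=1/FT=7 (more frames follow) and 0x3C as F=0/FT=7 (last frame), matching the muxer notes above.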
Wednesday, May 07, 2008
Execution Time calculation and Seconds calculation
---------------------------------------------------------
DWORD dwStart = timeGetTime();
Processing();
DWORD dwStop = timeGetTime();
// dwStop - dwStart is the processing time in milliseconds.
long millis = 143789;
long minutes = millis / 60000;          // 2
long seconds = (millis % 60000) / 1000; // 23
How can we sync audio and video in RTP
-----------------------------------------
1. Receive RTP video and RTP audio in separate filters.
We will constantly receive RTCP packets, one for every N RTP packets.
For RTP video, we get the RTCP NTP timestamp and RTP timestamp; at the same time, we get each RTP packet and
extract its timestamp.
NTP is network time; it is used as a reference clock.
NTP is common to audio and video; we adjust video based on NTP and adjust audio based on NTP, so it keeps them in sync.
Do the calculation as follows:
Video Timestamp = (RTP Timestamp of RTCP video packet / NTP timestamp of RTCP video packet) * RTP video Packet's RTP timestamp;
Audio Timestamp = (RTP Timestamp of RTCP Audio packet / NTP timestamp of RTCP Audio packet) * RTP audio Packet's RTP timestamp;
RTCP is sent on the RTP port + 1:
RTP Video Port number = 5000;
RTCP video Port number = RTP Video Port number + 1; ( 5001)
RTP audio Port number = 6000;
RTCP audio Port number = RTP audio Port number + 1; ( 6001 )
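The ratio formula above is one way to relate the clocks; the more common mapping (the RFC 3550 approach) anchors each RTP timestamp to the NTP time carried by the latest RTCP sender report for that stream. A sketch, with names of my own choosing:

```cpp
#include <cstdint>

// Map an RTP timestamp to NTP wall-clock seconds using the most recent
// RTCP sender report (SR) for the stream. Audio and video each map onto
// the shared NTP timeline, which is what keeps them in sync.
double RtpToNtpSeconds(uint32_t rtpTs,
                       uint32_t srRtpTs, double srNtpSeconds,
                       double clockRateHz)
{
    // Unsigned 32-bit subtraction so the result survives RTP timestamp wrap.
    uint32_t delta = rtpTs - srRtpTs;
    return srNtpSeconds + delta / clockRateHz;
}
```

For example, with a 90 kHz video clock, a packet whose RTP timestamp is 90,000 ticks past the SR's RTP timestamp is exactly one second past the SR's NTP time.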
Knowledge Base about PCM and Waveaudio
------------------------------------------
A wave file has 44 bytes of audio header info (for a canonical PCM WAV); the rest is audio data.
In the case of these PCM samples,
every audio sample block holds 20 ms of data; 20 ms of data is represented in 320 bytes (8,000 Hz * 2 bytes per sample * 0.02 s).
So every PCM sample block holds 320 bytes of data.
Find out: if the PCM audio source gives 8000 bytes of data, how many milliseconds of data is the PCM audio source giving?
The PCM audio source (a PC camera's mic) gives 8000 bytes of data at a time. 8000 bytes of data means
8000 / 320 (bytes per audio frame) = 25 audio frames at a time.
Each audio frame represents 20 ms of audio data:
25 frames * 20 ms = 500 ms.
The audio source is giving 500 ms of data at a time.
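The arithmetic above fits in one helper (the name and the 8 kHz / 16-bit mono assumption are mine, matching the 320-bytes-per-20-ms figure):

```cpp
#include <cstdint>

// Milliseconds of audio in a PCM buffer, assuming 8,000 Hz, 16-bit mono:
// 320 bytes per 20 ms frame, as worked out in the note above.
uint32_t PcmBufferMilliseconds(uint32_t bytes)
{
    const uint32_t kBytesPerFrame = 320; // 8000 Hz * 2 bytes * 0.02 s
    const uint32_t kMsPerFrame    = 20;
    return (bytes / kBytesPerFrame) * kMsPerFrame;
}
```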
Monday, May 05, 2008
Using DirectShow Filters in XtlTest
Here's a little trick you can do with the Ball filter, once you've added seeking functionality. Paste the following into a text file named ball.xtl:
Change the string "Example.avi" to the name of a video file on your system. Now compile the XtlTest sample that comes with the Microsoft DirectX® 8.1 SDK, and run it from the command line:
xtltest ball.xtl
You will see the ball bounce in front of the video clip. This XTL file creates a DirectShow Editing Services project with two video tracks. One track contains the video from the source file, and the other uses the Ball filter to generate the video. The XTL file also defines a chroma key transition, which composites the two video images. The chroma key uses the Ball filter's black background as the key color. This works because DirectShow Editing Services can use a source filter as a source clip, if the filter supports seeking.
Saturday, April 26, 2008
ffmpeg RTP and RTCP audio/video problem
The reason: ffmpeg used port 4999 for video and 5000 for audio.
But I am not getting proper audio at the client side; I got some glitches.
Reason :
---------------
1. We have to choose an even RTP port; RTCP packets will be sent by ffmpeg on (RTP port + 1) = the RTCP port number.
2. RTP video is sent on 4999, so ffmpeg sends the video RTCP info on 5000.
3. At the same time, ffmpeg sends audio RTP on port 5000 and audio RTCP on 5001.
4. So ffmpeg was sending RTP and RTCP packets on the same port, 5000.
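The collision check can be sketched as a small helper (the function name is mine; it encodes the convention that RTP goes on an even port and RTCP on RTP + 1):

```cpp
#include <cstdint>

// Two RTP streams conflict if any of their four ports (RTP/RTCP for each)
// collide. RTCP is always sent on the stream's RTP port + 1.
bool PortsConflict(uint16_t videoRtp, uint16_t audioRtp)
{
    uint16_t videoRtcp = videoRtp + 1;
    uint16_t audioRtcp = audioRtp + 1;
    return videoRtp == audioRtp  || videoRtcp == audioRtp ||
           videoRtp == audioRtcp || videoRtcp == audioRtcp;
}
```

The failing setup (video 4999, audio 5000) conflicts because video RTCP lands on the audio RTP port; an even-port pairing like video 5000 / audio 6000 does not.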
Monday, April 21, 2008
Source Filter's Output Pin Doesn't Match Any Known Format Type
---------------------------------------------------------------------------------------------
I developed a DShow player to render the Ball filter;
the CoCreateInstance() call works fine.
I get the output pin of the Ball filter and render it;
rendering fails. I checked the Ball filter's output pin against different video types (VideoInfo, VideoInfo2, MPEG1Video, MPEG2Video and DVVideo) and none of them matches the media type of the source filter.
Solution:
-----------
The reason is silly: I got this error because I had not added the Ball source filter's IBaseFilter object to the filter graph.
Adding the IBaseFilter object to the FilterGraph gives a working graph.
Problems Faced in registering the Filter DLL in Windows CE
-----------------------------------------------------------
I faced a problem in registering the DLL.
Solution:
-----------
There is no tool like regsvr32 to register a DLL in the Windows Mobile emulator environment,
so I developed my own version of a RegSvr32 application with the help of MSDN KB code.
My Register application will do the following:
1.Loads the DLL
2.GetProcAddress() fn to get the address of the "DllRegisterserver" and call it;
3.Unloads the DLL;
Now it is registered in the registry.
Previously it failed because the .def file was not included in the DLL project's input, which means:
within this .def file we specify the exported DLL functions as follows:
DllRegisterServer
DllUnregisterServer
Only then will we get the correct address of the DllRegisterServer function; otherwise we get a null pointer.
One more problem I faced while calling the DllRegisterServer() function address from my register application:
the application hung without giving any error; I got an exception within that function
and was not able to figure out the problem.
Solution:
-------------
So I checked the MSDN documentation as follows:
For Windows CE,we need to do the following:
STDAPI DllRegisterServer()
{
    // Desktop Windows uses AMovieDllRegisterServer2(TRUE); not supported on Windows CE.
    return AMovieDllRegisterServer();
}
STDAPI DllUnregisterServer()
{
    // Desktop Windows uses AMovieDllRegisterServer2(FALSE) to unregister; not supported on Windows CE.
    return AMovieDllUnregisterServer();
}
However, within DllRegisterServer and DllUnregisterServer you can customize the registration process as needed.
One more thing we need to do is define the factory template and implement GetSetupData():
CFactoryTemplate g_Templates[] =
{
{
g_wszName, // Name.
&CLSID_SomeFilter, // CLSID.
CSomeFilter::CreateInstance, // Creation function.
NULL,
&sudFilterReg // Pointer to filter information.
}
};
int g_cTemplates = sizeof(g_Templates) / sizeof(g_Templates[0]);
LPAMOVIESETUP_FILTER CSomeFilter::GetSetupData() // we must implement this for Windows CE
{
    return (LPAMOVIESETUP_FILTER)&sudFilterReg;
}
Saturday, April 19, 2008
Windows CE DirectShow Filter Error
-------------------------------------
I developed the Ball Source filter in Windows CE;
I got these errors:
Creating library Pocket PC 2003 (ARMV4)\Release/Ball_Filter.lib and object Pocket PC 2003 (ARMV4)\Release/Ball_Filter.exp
1>fball.obj : error LNK2019: unresolved external symbol "public: __cdecl CSourceStream::CSourceStream(wchar_t *,long *,class CSource *,wchar_t const *)" (??0CSourceStream@@QAA@PA_WPAJPAVCSource@@PB_W@Z) referenced in function "public: __cdecl CBallStream::CBallStream(long *,class CBouncingBall *,wchar_t const *)" (??0CBallStream@@QAA@PAJPAVCBouncingBall@@PB_W@Z)
1>fball.obj : error LNK2001: unresolved external symbol "protected: virtual long __cdecl CSourceStream::QueryId(wchar_t * *)" (?QueryId@CSourceStream@@MAAJPAPA_W@Z)
1>fball.obj : error LNK2019: unresolved external symbol "public: __cdecl CSource::CSource(wchar_t *,struct IUnknown *,struct _GUID)" (??0CSource@@QAA@PA_WPAUIUnknown@@U_GUID@@@Z) referenced in function "private: __cdecl CBouncingBall::CBouncingBall(struct IUnknown *,long *)" (??0CBouncingBall@@AAA@PAUIUnknown@@PAJ@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CSource::FindPin(wchar_t const *,struct IPin * *)" (?FindPin@CSource@@UAAJPB_WPAPAUIPin@@@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CBaseFilter::JoinFilterGraph(struct IFilterGraph *,wchar_t const *)" (?JoinFilterGraph@CBaseFilter@@UAAJPAUIFilterGraph@@PB_W@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CBaseFilter::QueryVendorInfo(wchar_t * *)" (?QueryVendorInfo@CBaseFilter@@UAAJPAPA_W@Z)
1>Pocket PC 2003 (ARMV4)\Release/Ball_Filter.dll : fatal error LNK1120: 6 unresolved externals
Solution Steps:
--------------------
I modified things in "Release" mode.
It may be due to the calling-convention mechanism,
but ARM doesn't have the calling-convention compiler options (/Gr, /Gd, /Gz);
calling conventions are applicable only to x86 machines, and I am using ARM.
I doubted whether strmbase.lib was built for x86; but I previously developed a DShow player application
with this same strmbase.lib, so there is nothing wrong with the lib.
I looked closely at all the failing functions; they all contain wchar_t.
So it may be a wchar_t/Unicode problem.
Solution:
----------
Project -> Properties -> C/C++ -> Language ->
Treat wchar_t as Built-in Type: Yes
I changed this property to No:
Treat wchar_t as Built-in Type: No
Now all the linker errors are resolved;
I no longer have any errors.
Note:
---------
Even after doing all the above, in Debug mode I got the following error:
Creating library Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.lib and object Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.exp
fball.obj : error LNK2001: unresolved external symbol "public: virtual unsigned long __cdecl CBaseFilter::NonDelegatingRelease(void)" (?NonDelegatingRelease@CBaseFilter@@UAAKXZ)
Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.dll : fatal error LNK1120: 1 unresolved externals
For Debug mode,
I added this code within the CSource-derived class:
public:
STDMETHODIMP_(ULONG) NonDelegatingAddRef()
{
return InterlockedIncrement(&m_cRef);
}
// Decrement the reference count for an interface.
STDMETHODIMP_(ULONG) NonDelegatingRelease()
{
if(InterlockedDecrement(&m_cRef) == 0)
{
delete this;
return 0;
}
return m_cRef;
}
Now it is working fine;
I have successfully built the filter code.
Friday, April 18, 2008
Audio and Video Port Problem in Audio Testing
=================
1. I combined multiple RTP packets into a single packet and passed it to the AMR parser application.
2. Next I added the AMR file extension and the AMR start codes and tried to play it with the QT Player;
the created AMR file works with the QT Player.
3. Check the Marker bit of the RTP header format.
If we send the audio and video through the INVC transcoder, I see the RTP packet numbers printed as follows:
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 60, Buffer size = 1389, Marker = 0
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 61, Buffer size = 1389, Marker = 0
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 62, Buffer size = 1389, Marker = 0
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 6, Buffer size = 28, Marker = 128
RTP packet Number = 63, Buffer size = 1389, Marker = 0
When ffmpeg sends AMR audio over RTP, I got the following:
RTP packet Number = 3, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 3
RTP packet Number = 4, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 4
RTP packet Number = 5, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 5
RTP packet Number = 6, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 6
RTP packet Number = 7, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 7
RTP packet Number = 8, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 8
RTP packet Number = 9, Buffer size = 1389, Marker = 0 HSB = 0 , LSB= 9
While sending audio and video together we have the problem; one more thing: if we send audio and video, the QuickTime player is able to render the audio.
So we need to parse without relying on the packet number.
The beginning of every audio frame is marked by the Marker bit as zero; otherwise it will be 1 (TRUE).
When I developed the AMR parser for this scenario, it caused a screeching sound in the audio at regular intervals. At the client side, receiving the RTP packets and dumping them into a file gives the same problem.
Solution:
-----------------
Finally I got the solution:
we were sending video data on port 5000 and audio data on port 5001, and that causes the problem.
If instead I change it to video on 4500 and audio on 5000, there is no problem.
Every RTP AMR packet has header info;
there must be some separation between the audio and video ports so that we can avoid this situation.
Thursday, April 03, 2008
Two types of AMR:
-----------------------
1. AMR or AMR_NB (AMR narrowband) - mono, 8 kHz sampled audio
2. AMR_WB (AMR wideband) - 16 kHz sampled audio
We are using AMR_NB.
Every AMR file begins with following Hex start codes:
23 21 41 4D 52 0A (the ASCII string "#!AMR\n")
If we open an AMR file without these start codes, QuickTime displays:
---------------------------
Error
---------------------------
Error -2048: Couldn't open the file Quotes.amr because it is not a file that QuickTime understands.
---------------------------
OK
---------------------------
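As a quick sketch, checking a buffer for this signature looks like the following (the helper name is mine):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// The 6-byte AMR-NB file signature "#!AMR\n" (23 21 41 4D 52 0A).
bool HasAmrSignature(const uint8_t* data, size_t len)
{
    static const uint8_t kMagic[6] = {0x23, 0x21, 0x41, 0x4D, 0x52, 0x0A};
    return len >= 6 && std::memcmp(data, kMagic, 6) == 0;
}
```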
Every audio frame has a one-byte frame header in the following format:
---------------------------------------------
F (1 bit) | FT (4 bits) | Q (1 bit) | padding (2 bits)
---------------------------------------------
FT | Mode | Frame Size (including 1-byte header) |
0 | AMR 4.75 | 13 |
1 | AMR 5.15 | 14 |
2 | AMR 5.9 | 16 |
3 | AMR 6.7 | 18 |
4 | AMR 7.4 | 20 |
5 | AMR 7.95 | 21 |
6 | AMR 10.2 | 27 |
7 | AMR 12.2 | 32 |
If we have a file at 12.2 kbps, we can check this via the frame header;
moreover, every frame has a 1-byte header, the same as the first. Assume the first frame's header is 3C; then every frame must begin with that header start code.
QuickTime checks whether the header has a proper start code, and next whether every frame has the same header info; otherwise it reports an error like:
"Quicktime media file is having incorrect duration"
I have to do one more check:
if every frame has a different header start code, will QuickTime still play that AMR file?
3C is the frame header for the first frame;
38 is the frame header for the second frame;
both frame headers indicate the same frame type.
Will it work?
Result :
----------
It works fine.
Test case 2:
-------------------
Instead of 0x38, I inserted 0x20;
then I didn't get any sound while playing it in the QuickTime player, and moreover QuickTime gave no errors.
I tried another value in place of 0x20 and faced the same problem.
Every AMR frame holds 20 ms of data.
So if 40 frames are available:
40 * 20 = 800 milliseconds (play time of those frames);
1000 milliseconds = 1 second
AMR over RTP:
RTP packet size   1389
- RTP header size   12
                 --------
Payload bytes     1377
-----------
(1377 / frame size) - 1 = number of ToC entries; each frame has a ToC entry in the RTP packet.
Or just count the number of ToC bytes to identify how many frames are available in an RTP packet.
How can we extract the AMR mode information from the RTP packet?
First check the CMR (Codec Mode Request).
It indicates the bit rate, and for each bit rate the frame size is fixed:
CMR | Mode      | Frame size (bytes)
 0  | AMR 4.75  | 13
 1  | AMR 5.15  | 14
 2  | AMR 5.9   | 16
 3  | AMR 6.7   | 18
 4  | AMR 7.4   | 20
 5  | AMR 7.95  | 21
 6  | AMR 10.2  | 27
 7  | AMR 12.2  | 32
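The table above can be sketched as a lookup helper (the function name is mine):

```cpp
#include <cstdint>

// AMR-NB frame size in bytes (including the 1-byte frame header), indexed by
// frame type / CMR value 0-7, matching the table above. Returns -1 for
// values outside the speech-frame range.
int AmrFrameSize(uint8_t frameType)
{
    static const int kSizes[8] = {13, 14, 16, 18, 20, 21, 27, 32};
    return frameType < 8 ? kSizes[frameType] : -1;
}
```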
From the source filter, based on the number of frames, we need to set the start and stop timestamps.
AMR over RTP is as follows:
+----------------+-------------------+----------------
| payload header | table of contents | speech data ...
+----------------+-------------------+----------------
The payload header's first 4 bits are the CMR (Codec Mode Request); from the CMR value, we can identify the frame size.
ToC (table of contents):
each and every frame has an entry in the ToC.
If the RTP packet holds 43 audio frames, that many ToC bytes must be present.
After F0, remove the run of identical bytes; these are the ToC entries (each byte indicates the audio bit-rate information).
But how can we know the frame type? From the ToC byte:
1011 1100 (BC) - 12.2 kbps
1011 0100 (B4) - 10.2 kbps
1010 1100 (AC) - 7.95 kbps
RTP packet:
---------------
If a terminal has no preference as to which mode to receive, it SHOULD set CMR=15 in all its outbound payloads.
Each RTP AMR payload begins with F0 and then repeated ToC entries like 0xAC, 0xBC, 0xB4.
If the RTP packet has N frames, the RTP packet has N ToC entries.
From the ToC entry we can determine the frame size of the audio frame.
A ToC entry has the following form:
---------------------------------------------
F (1 bit) | FT (4 bits) | Q (1 bit) | padding (2 bits)
---------------------------------------------
1 0111 1 00 - 12.2 kbps (0xBC)
1 0110 1 00 - 10.2 kbps (0xB4)
1 0101 1 00 - 7.95 kbps (0xAC)
------------------------------------------------
F0 BC BC BC BC BC BC BC BC BC 3C
After the ToC contents, the first start code acts as a frame header;
3C is the frame header for an audio frame, and every audio frame must begin with 3C.
From the bit rate, we can determine the frame size:
the frame sizes are the same as in the CMR table above, and the frame size includes the frame header.
But the RTP packet holds the frames as follows:
only the first frame has the 1-byte audio frame header; the rest of the frames do not, so we need to add it manually.
AC 12 20 39 40 29 20 39 33
AC is a frame header, and from the header we can identify the number of bytes per frame. Assume the frame header indicates 4.75 kbps (frame size 13): starting from the frame header, count 13 bytes; then insert the first frame's header, count 13 bytes from that header, and insert the frame header for the 3rd frame.
In other words: insert the first frame's header after 13 bytes, then after 26 bytes, then after 39 bytes, and so on repeatedly.
If we do not insert the frame header at the start of every frame, the AMR decoder will still decode it, but you will not get any audible audio.
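The header-reinsertion procedure above can be sketched like this, under the note's assumption that only the first frame in the payload carries its 1-byte header (function and table names are mine):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Frame size in bytes (including the 1-byte header) for AMR-NB types 0-7.
static const int kAmrFrameSize[8] = {13, 14, 16, 18, 20, 21, 27, 32};

// Re-insert the first frame's header before every frame, as described above:
// the payload starts with one header byte, then headerless speech frames.
std::vector<uint8_t> InsertFrameHeaders(const std::vector<uint8_t>& payload)
{
    std::vector<uint8_t> out;
    if (payload.empty()) return out;
    uint8_t header = payload[0];
    uint8_t ft = (header >> 3) & 0x0F;       // frame type from the header byte
    if (ft > 7) return out;                  // not a speech frame type
    size_t speechPerFrame = kAmrFrameSize[ft] - 1; // data bytes per frame
    size_t pos = 1;                          // skip the single header present
    while (pos < payload.size()) {
        out.push_back(header);               // header for this frame
        for (size_t i = 0; i < speechPerFrame && pos < payload.size(); ++i)
            out.push_back(payload[pos++]);
    }
    return out;
}
```

For a 4.75 kbps payload (13-byte frames, so 12 speech bytes each), a header byte plus 24 speech bytes comes out as two 13-byte frames, each starting with the header.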
Wednesday, April 02, 2008
How to stream AMR to RTP and play it in QuickTime
With the SDP file below, QuickTime plays the audio.
AMR.SDP File contents:
-----------------------
v=0
o=Mass 3123312 121232 IN IP4 192.168.1.198
s=Rtsp Session
c=IN IP4 192.168.1.198
t=0 0
a=range:npt=0-
m=audio 5000 RTP/AVP 97
a=rtpmap:97 AMR/8000
a=fmtp:97 octet-align=1
Thursday, March 27, 2008
Important notes
------------------------------------
When the Filter Graph Manager tries to connect two filters, the pins must agree on various things. If they cannot, the connection attempt fails. Generally, pins negotiate the following:
1.Transport
2.Media type
3.Allocator
Transport:
------------
The transport is the mechanism that the filters will use to move media samples from the output pin to the input pin. For example, they can use the IMemInputPin interface ("push model") or the IAsyncReader interface ("pull model").
Media type:
------------
Almost all pins use media types to describe the format of the data they will deliver.
Allocator:
--------------
The allocator is the object that creates the buffers that hold the data. The pins must agree which pin will provide the allocator. They must also agree on the size of the buffers, the number of buffers to create, and other buffer properties.
The base classes implement a framework for these negotiations. You must complete the details by overriding various methods in the base class. The set of methods that you must override depends on the class and on the functionality of your filter.
Processing and Delivering Data:
------------------------------------
The primary function of most filters is to process and deliver media data. How that occurs depends on the type of filter:
A push source has a worker thread that continuously fills samples with data and delivers them downstream.
A pull source waits for its downstream neighbor to request a sample. It responds by writing data into a sample and delivering the sample to the downstream filter. The downstream filter creates the thread that drives the data flow.
A transform filter has samples delivered to it by its upstream neighbor. When it receives a sample, it processes the data and delivers it downstream.
A renderer filter receives samples from upstream, and schedules them for rendering based on the time stamps.
Tuesday, March 25, 2008
RTP audio filter latency problem...
2. Today we improved the audio quality by implementing IAMPushSource and setting the latency:
GetLatency() { *plLatency = 3500000; return S_OK; } // 350 ms
I implemented IAMPushSource on my live source filter's output pin.
I faced a 2-3 second latency problem,
so I modified things as follows:
STDMETHODIMP CRTPAudioStream::GetPushSourceFlags(ULONG *pFlags)
{
*pFlags = AM_PUSHSOURCECAPS_PRIVATE_CLOCK;
//The filter time stamps the samples using a private clock.
//The clock is not available to the rest of the graph through IReferenceClock.
return S_OK;
}
Now everything works fine. Within the filter, we implemented it as follows:
class CRTPFilter: public CSourceStream, public IAMFilterMiscFlags
{
public: // IAMFilterMiscFlags override
ULONG STDMETHODCALLTYPE GetMiscFlags(void)
{ return AM_FILTER_MISC_FLAGS_IS_SOURCE; }
};
class CRTPAudioStream : public CSourceStream,public IAMPushSource
{
public:
//IAMPushSource
STDMETHODIMP GetMaxStreamOffset(REFERENCE_TIME *prtMaxOffset)
{return E_NOTIMPL;}
STDMETHODIMP GetPushSourceFlags(ULONG *pFlags)
{ *pFlags = AM_PUSHSOURCECAPS_PRIVATE_CLOCK;return S_OK;}
STDMETHODIMP GetStreamOffset(REFERENCE_TIME *prtOffset)
{return E_NOTIMPL;}
STDMETHODIMP SetMaxStreamOffset(REFERENCE_TIME rtMaxOffset)
{return E_NOTIMPL;}
STDMETHODIMP SetPushSourceFlags(ULONG Flags)
{return E_NOTIMPL;}
STDMETHODIMP SetStreamOffset(REFERENCE_TIME rtOffset)
{return E_NOTIMPL;}
STDMETHODIMP GetLatency(REFERENCE_TIME *prtLatency)
{ *prtLatency = 4500000; return S_OK;} // 450 ms latency for audio (REFERENCE_TIME is in 100-ns units)
};
Saturday, March 15, 2008
CSource filter architecture with a memory-mapped file
Within the source filter, we read data from a memory-mapped file. A separate application reads data from the network and writes it to the memory-mapped file. Both the source filter and the network-receiving application must reside on the same system. If we do it this way, we can render the received audio and video data without any problem. We can also queue the received network packets.
I have seen an application for the following (the source filter renders the data from a memory-mapped file):
1. A camera source filter renders the data from the memory-mapped file.
2. Another application creates the filter graph with the source filter and implements a thread to receive packets from the network, queueing them in a data structure.