Showing posts with label Directshow. Show all posts

Friday, April 10, 2009

How to handle seek operation in Splitter or Source Filter ?

How to handle seek operation in Splitter /Source Filter:
-----------------------------------------------------------
1. SetPositions() may be called multiple times for a single seek operation.
2. Set an m_bSeeked flag in SetPositions(), and record the seek start position.
3. Store the previous seek value. Within FillBuffer(), if m_bSeeked is set, compare the current seek position with the previous one; if they differ, send a seek request to the stack or file parser.
So even if SetPositions() is called multiple times (say, three times) for a single seek operation, our filter sends the seek request only once, by keeping the seeked flag and comparing the previous seek position with the current one.
During seeking, FillBuffer() must not wait on any event; it must stay unblocked. SetPositions() flushes all the output pins, and flushing waits for FillBuffer() to complete its operation.
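A minimal, self-contained model of steps 1-3 (SeekState and its members are illustrative names, not the actual filter code):

```cpp
#include <cassert>

// Minimal model of deduplicating seek requests: SetPositions() may run
// several times for one user seek, but the parser is asked to seek only
// when the requested position actually changes.
typedef long long REFERENCE_TIME; // 100-ns units, as in DirectShow

struct SeekState {
    bool m_bSeeked;
    REFERENCE_TIME m_rtSeekPos;   // position requested by SetPositions()
    REFERENCE_TIME m_rtPrevSeek;  // last position actually sent to the parser
    int parserSeeks;              // counts real seek requests (for the demo)

    SeekState() : m_bSeeked(false), m_rtSeekPos(0), m_rtPrevSeek(-1), parserSeeks(0) {}

    // Called from SetPositions(): just record the request and set the flag.
    void SetPositions(REFERENCE_TIME rtStart) {
        m_rtSeekPos = rtStart;
        m_bSeeked = true;
    }

    // Called at the top of FillBuffer(): forward the seek once per change.
    void OnFillBuffer() {
        if (m_bSeeked) {
            if (m_rtSeekPos != m_rtPrevSeek) {
                ++parserSeeks; // stands in for the real seek request to the parser
                m_rtPrevSeek = m_rtSeekPos;
            }
            m_bSeeked = false;
        }
    }
};
```

Even when SetPositions() fires three times for the same target position, parserSeeks advances only once.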

Sunday, March 29, 2009

Audio goes off after seeking several times

Audio goes off after seeking several times :

When I seek several times, the audio goes off.

Reason:

After seeking, negative timestamps are set on the video and audio media samples. The video was rendered even though its timestamps were negative, but the audio was not rendered reliably, because the audio renderer is the reference clock for both streams and the video is synchronized against the audio timestamps. So the audio timestamp must not be negative after seeking; if it is, we cannot predict the rendering (sometimes a sample renders, sometimes it does not).

Solution:

I modified the source filter to output positive timestamps for video and audio after seeking, and the issue was resolved.
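One way to keep the timestamps positive is sketched below; FixSeekTimestamps is an illustrative helper, not an SDK call:

```cpp
#include <cassert>

// Illustrative helper (not an SDK call): if a sample's start time went
// negative after a seek, shift both times up so the start is zero and the
// sample's duration is preserved.
typedef long long REFERENCE_TIME; // 100-ns units

inline void FixSeekTimestamps(REFERENCE_TIME& rtStart, REFERENCE_TIME& rtStop)
{
    if (rtStart < 0) {
        rtStop -= rtStart; // shift the stop time by the same amount
        rtStart = 0;
    }
}
```

Samples with non-negative start times pass through unchanged.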

Monday, August 04, 2008

How to replace our video renderer with Microsoft's video renderer?

If we are developing our own renderer, we have to make the media player use our own video renderer filter. What do we have to do?

Microsoft's video renderer is registered as:
[HKEY_CLASSES_ROOT\CLSID\{70e102b0-5556-11ce-97c0-00aa0055595a}]
@="Video Renderer"
"Merit"=dword:00800000
Check its merit.

Then set our own video renderer filter's merit as follows:
0x800000 + 1
Windows Media Player will then make use of our own video renderer filter. DirectShow loads filters based on merit; the filter with the higher merit is preferred by Windows Media Player, and by Intelligent Connect when connecting filters in a filter graph.
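Mirroring the registry layout shown above, our own renderer's entry could look like this .reg fragment; the CLSID below is a hypothetical placeholder for our filter's GUID:

```
; hypothetical CLSID -- replace with our own video renderer filter's GUID
[HKEY_CLASSES_ROOT\CLSID\{00000000-0000-0000-0000-000000000001}]
@="My Video Renderer"
"Merit"=dword:00800001
```

0x00800001 is exactly 0x800000 + 1, one above the stock renderer's merit.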

Wednesday, July 09, 2008

Camera Driver Pin resolution problem


Camera Driver Pin resolution problem:
---------------------------------------------------------
1. The capture pin's resolutions must also be supported on the preview pin.
If the capture pin resolutions do not match the preview pin resolutions:
Cause:
--------
This is a bug in the camera (capture) driver; every resolution offered on the capture pin must also be supported on the preview pin.

Monday, July 07, 2008

Observation about Less than 1 Sec Video

What I did:
------------------
1. I queried GetCurrentPosition() from the player application; the TIME_FORMAT_MEDIA_TIME flag is set, the same as in our filters.
If the maximum duration is 30000, then setting 30001 seeks past the end of the file.



TrackType =0, Start : 0,Stop :990000
TrackType =0, Start : 990000,Stop :1990000
TrackType =0, Start : 1990000,Stop :3050000
TrackType =0, Start : 3050000,Stop :3990000
TrackType =0, Start : 3990000,Stop :4990000
TrackType =0, Start : 4990000,Stop :5980000
TrackType =0, Start : 5980000,Stop :6980000
Track Type:0 Track Duration is 798,Total Duration is:798
VideoTrack Reaches End: 8380000
TrackType =0, Start : 6980000,Stop :8380000
Stream Time is :7980000
m_nCurrentPos is 7980000, but for the last media sample we set the timestamp to 8380000, which is greater than the stream time.

I used the player application to get the total duration of the media file, and I printed the current position using the IMediaSeeking interface. This problem occurs with multimedia files containing less than one second of video and audio; if I add 200 milliseconds, it works fine.

Tuesday, July 01, 2008

Problem while Playing the video less than one second

Problem while Playing the video less than one second:
-----------------------------------------------------
If the video is less than one second long, the media player's seek bar never reaches the end position. I tested videos of less than 8 seconds with AAC-, QCELP-, and AMR-NB-encoded audio in a 3GP file, and I got the same error in each case:

For video less than 1 second:
-----------------------------------
GetDuration : 7930000
End stop value of the media sample : 7190001
7190001 / 7930000 = 0.90, so the track bar does not reach the end of the media player.




For video > 1 second:
-------------------
GetDuration : 172400000
End stop value of the media sample : 172200001
172200001 / 172400000 = 0.99, so the error is not clearly visible with this longer video, but the track bar still never reaches the end position of the media player.
So the problem is in the timestamp calculation.
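The mismatch can be seen with a quick calculation; TrackbarFraction is an illustrative helper fed with the stop times and durations from the logs above (100-ns units):

```cpp
#include <cassert>

// The seek-bar position is roughly (last sample's stop time) / (reported
// duration); when the last stop falls short of the duration, the bar
// stops short of the end.
inline double TrackbarFraction(long long lastStop, long long duration)
{
    return static_cast<double>(lastStop) / static_cast<double>(duration);
}
```

For the short clip the fraction is about 0.91, visibly short of the end; for the long clip it is about 0.999, so the same error is hidden.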

No Time stamp has been set for this sample Error

I got the "No Time stamp has been set for this sample" while playing the 3gp file;
-----------------------------------------------------------------------------------
HRESULT FillBuffer()
{
    HRESULT hr;
    hr = GetMP4Sample(pSample);
    if (SUCCEEDED(hr))
    {
        REFERENCE_TIME rtStart = 0, rtStop = 0;
        hr = pSample->GetTime(&rtStart, &rtStop);
        if (SUCCEEDED(hr))
        {
            wchar_t szMsg[MAX_PATH];
            swprintf(szMsg, L"\n 0-vid, 1-audio, TrackType = %d, Start = %ld, Stop = %ld",
                     m_nTrakType, (LONG)rtStart, (LONG)rtStop);
            OutputDebugString(szMsg);
        }
    }
    return hr;
}



Solution:
===========
The above code triggers the "No timestamp has been set for this sample" error.
The reason is that when GetTime() fails, its failure HRESULT is returned from
FillBuffer(), which produces the error. If I modify it as follows, the error goes away:

hr = GetMP4Sample(pSample);
if (SUCCEEDED(hr))
{
    REFERENCE_TIME rtStart = 0, rtStop = 0;
    HRESULT temphr = pSample->GetTime(&rtStart, &rtStop);
    if (SUCCEEDED(temphr))
    {
        wchar_t szMsg[MAX_PATH];
        swprintf(szMsg, L"\n 0-vid, 1-audio, TrackType = %d, Start = %ld, Stop = %ld",
                 m_nTrakType, (LONG)rtStart, (LONG)rtStop);
        OutputDebugString(szMsg);
    }
}

Record the video and audio using PIMG with desired video and audio encoder

Record the video and audio using PIMG with desired video and audio encoder:
----------------------------------------------------------------------------

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Pictures\Camera\OEM\AudioEncoderClassID — set to the AAC encoder's class ID
\VideoProfile\1\AudioEncoderClassID
\VideoProfile\2\AudioEncoderClassID
\VideoProfile\3\AudioEncoderClassID
\VideoEncoderClassID
\MuxerClassID

{f0db0b53-53ff-49b9-8c51-4ed44cec6d46} - AAC Encoder
Just copy our encoder DMO's class ID to the registry locations above. PIMG (Microsoft's Pictures and Videos application) records, encodes, and multiplexes the video according to these OEM registry settings. I have tested this behavior on a Motorola device, but some mobiles (for example, Asus) may not implement it. By default, if the device records a WMV file, the corresponding class IDs will be present at this registry location.

Friday, June 27, 2008

Dshow Camera capture application Problem in Windows Mobile

Dshow Camera capture application Problem in Windows Mobile:
-----------------------------------------------------------
pimg.exe (Pictures and Videos) behavior:
-----------------------------------------
If we are writing a camera capture application, then whenever the user hits the Home button, the application must release the video and audio capture sources. Whenever the Back button is pressed (if available), or the user switches back to the camera application through the Task Manager, the camera capture application must resume its normal execution (rendering video from the video source). It makes use of the video and audio capture filters, so every camera capture application must follow this behavior.

In my camera capture application, whenever the user hits the Home button, the application focus changes, and the Windows Mobile OS sends the following messages sequentially:
1. WM_ACTIVATE with WA_INACTIVE
2. WM_ACTIVATE with WA_ACTIVE

So whenever the user hits the Home button, WM_ACTIVATE is fired with WA_INACTIVE.
Steps:
-------
1. The Home button was pressed (indicated by WM_ACTIVATE with WA_INACTIVE): release the video and audio capture filters in the WA_INACTIVE handler and set the boolean variable bInActive to TRUE.
2. When the Back button was pressed, the camera application regained focus, but it did not receive any focus, activate, or WM_ACTIVATE message. So whenever our camera application gets focus, it must repaint its window; within the WM_PAINT handler, I check whether the bInActive flag is set, and if so, I CoCreateInstance the video capture filter and the audio capture filter again and start rendering video.
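The two steps above can be sketched as a tiny state machine; CameraApp and the message constants are simplified stand-ins for the real Win32 handler:

```cpp
#include <cassert>

// Simplified model of the capture-app lifecycle described above:
// WA_INACTIVE (Home pressed) releases the capture filters; the next
// WM_PAINT after regaining focus re-creates them and restarts rendering.
enum Msg { MSG_INACTIVE, MSG_PAINT };

struct CameraApp {
    bool bInActive = false;   // set when we released the filters
    bool filtersAlive = true; // stands in for the capture filter objects

    void OnMessage(Msg m) {
        if (m == MSG_INACTIVE) {     // WM_ACTIVATE / WA_INACTIVE
            filtersAlive = false;    // release video + audio capture filters
            bInActive = true;
        } else if (m == MSG_PAINT) { // WM_PAINT after regaining focus
            if (bInActive) {
                filtersAlive = true; // CoCreateInstance the filters again
                bInActive = false;   // and start rendering video
            }
        }
    }
};
```

A WM_PAINT with no pending WA_INACTIVE is a no-op, so ordinary repaints do not re-create the filters.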


Tuesday, May 20, 2008

Sample Grabber Problem

Sample Grabber Problem:
----------------------------
My client's callback function executed successfully the first time; after my callback function ran, I got an error.
Solution:
------------
1. It may be due to mismatched calling conventions.
2. That means my sample grabber application was built with _stdcall and my client application with _cdecl. After I changed my client application to _stdcall, it ran without any problem.
The sample grabber callback function must use the same calling convention on both sides;
check Project -> Settings -> Code Generation -> Calling Convention.

Monday, May 05, 2008

Using Directshow Filters in XTL Test

Using the Ball Filter in a DirectShow Editing Services Project
Here's a little trick you can do with the Ball filter, once you've added seeking functionality. Paste the following into a text file named ball.xtl:

<timeline>
  <group type="video" width="320" height="240">
    <track>
      <clip src="Example.avi" start="0" stop="5" />
    </track>
    <track>
      <clip start="0" stop="5">
        ...
      </clip>
      ...
    </track>
  </group>
</timeline>
Change the string "Example.avi" to the name of a video file on your system. Now compile the XtlTest sample that comes with the Microsoft DirectX® 8.1 SDK, and run it from the command line:
xtltest ball.xtl
You will see the ball bounce in front of the video clip. This XTL file creates a DirectShow Editing Services project with two video tracks. One track contains the video from the source file, and the other uses the Ball filter to generate the video. The XTL file also defines a chroma key transition, which composites the two video images. The chroma key uses the Ball filter's black background as the key color. This works because DirectShow Editing Services can use a source filter as a source clip, if the filter supports seeking.

Monday, April 21, 2008

Source Filter's Output Pin doesn't match any known format type

Source Filter's Output Pin doesn't match any known format type like FORMAT_VideoInfo, VideoInfo2...
---------------------------------------------------------------------------------------------
I developed a DirectShow player to render the Ball filter. CoCreateInstance() worked fine; I got the output pin of the Ball filter and tried to render it, but rendering failed. I checked the Ball filter's output pin against different video types (VideoInfo, VideoInfo2, MPEG1Video, MPEG2Video, and DVVideo) and none of them matched the media type of the source filter.
Solution:
-----------
The reason is silly: I got this error because I had not added the Ball source filter's IBaseFilter object to the filter graph. Adding the IBaseFilter object to the filter graph gave a working graph.

Problems Faced in registering the Filter DLL in Windows CE

Problems Faced in registering the Filter DLL in Windows CE:
-----------------------------------------------------------
I faced a problem registering the DLL.
Solution:
-----------
There is no tool like regsvr32 to register a DLL in the Windows Mobile emulator environment, so I developed my own version of a RegSvr32 application with the help of MSDN KB code.
My register application does the following:
1. Loads the DLL.
2. Calls GetProcAddress() to get the address of "DllRegisterServer" and calls it.
3. Unloads the DLL.
Now the filter is registered in the registry.








At first, GetProcAddress() returned a NULL pointer because the .def file was not included in the DLL project's input. Within the .def file we specify the exported DLL functions as follows:
DllRegisterServer
DllUnregisterServer
Only then do we get the correct address of the DllRegisterServer function; otherwise we get a NULL pointer.

One more problem I faced while calling the DllRegisterServer() function address from my register application: the application hung without giving any error; I got an exception within that function and was not able to figure out the problem.


Solution:
-------------
So I checked the MSDN documentation. For Windows CE, we need to do the following:
STDAPI DllRegisterServer()
{
    // Desktop Windows uses AMovieDllRegisterServer2(TRUE); that is not supported on Windows CE.
    return AMovieDllRegisterServer();
}
STDAPI DllUnregisterServer()
{
    // Desktop Windows uses AMovieDllRegisterServer2(FALSE) to unregister; not supported on Windows CE.
    return AMovieDllUnregisterServer();
}
However, within DllRegisterServer and DllUnregisterServer you can customize the registration process as needed.
One more thing: for Windows CE we must also provide the factory template and implement GetSetupData():
CFactoryTemplate g_Templates[] =
{
    {
        g_wszName,                   // Name.
        &CLSID_SomeFilter,           // CLSID.
        CSomeFilter::CreateInstance, // Creation function.
        NULL,
        &sudFilterReg                // Pointer to filter information.
    }
};
int g_cTemplates = sizeof(g_Templates) / sizeof(g_Templates[0]);

LPAMOVIESETUP_FILTER CSomeFilter::GetSetupData() // we must implement this one for Windows CE
{
    return (LPAMOVIESETUP_FILTER)&sudFilterReg;
}

Saturday, April 19, 2008

Windows CE Directshow Filter Error

Windows CE Directshow Filter Error :
-------------------------------------
I developed the Ball source filter on Windows CE and got these errors:
Creating library Pocket PC 2003 (ARMV4)\Release/Ball_Filter.lib and object Pocket PC 2003 (ARMV4)\Release/Ball_Filter.exp
1>fball.obj : error LNK2019: unresolved external symbol "public: __cdecl CSourceStream::CSourceStream(wchar_t *,long *,class CSource *,wchar_t const *)" (??0CSourceStream@@QAA@PA_WPAJPAVCSource@@PB_W@Z) referenced in function "public: __cdecl CBallStream::CBallStream(long *,class CBouncingBall *,wchar_t const *)" (??0CBallStream@@QAA@PAJPAVCBouncingBall@@PB_W@Z)
1>fball.obj : error LNK2001: unresolved external symbol "protected: virtual long __cdecl CSourceStream::QueryId(wchar_t * *)" (?QueryId@CSourceStream@@MAAJPAPA_W@Z)
1>fball.obj : error LNK2019: unresolved external symbol "public: __cdecl CSource::CSource(wchar_t *,struct IUnknown *,struct _GUID)" (??0CSource@@QAA@PA_WPAUIUnknown@@U_GUID@@@Z) referenced in function "private: __cdecl CBouncingBall::CBouncingBall(struct IUnknown *,long *)" (??0CBouncingBall@@AAA@PAUIUnknown@@PAJ@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CSource::FindPin(wchar_t const *,struct IPin * *)" (?FindPin@CSource@@UAAJPB_WPAPAUIPin@@@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CBaseFilter::JoinFilterGraph(struct IFilterGraph *,wchar_t const *)" (?JoinFilterGraph@CBaseFilter@@UAAJPAUIFilterGraph@@PB_W@Z)
1>fball.obj : error LNK2001: unresolved external symbol "public: virtual long __cdecl CBaseFilter::QueryVendorInfo(wchar_t * *)" (?QueryVendorInfo@CBaseFilter@@UAAJPAPA_W@Z)
1>Pocket PC 2003 (ARMV4)\Release/Ball_Filter.dll : fatal error LNK1120: 6 unresolved externals


Solution Steps:
--------------------
I fixed this in "Release" mode.

At first I suspected the calling convention, but ARM has no calling-convention compiler options (/Gr, /Gd, /Gz); calling conventions apply only to x86 machines, and I am building for ARM.

I also doubted whether strmbase.lib was built for x86; but I had previously built a DirectShow player application against this same strmbase.lib, so there is nothing wrong with the library.

I looked closely at all the unresolved functions: they all take wchar_t parameters. Could it be a Unicode problem?
Solution:
----------
Project -> Properties -> C++ -> Language ->
Treat wchar_t as Built-in Type: Yes

I changed this property to No:
Treat wchar_t as Built-in Type: No

Now all the linker errors are resolved, and I have no errors.


Note:
---------
Even after doing all the above in Debug mode, I got the following error:
Creating library Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.lib and object Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.exp
fball.obj : error LNK2001: unresolved external symbol "public: virtual unsigned long __cdecl CBaseFilter::NonDelegatingRelease(void)" (?NonDelegatingRelease@CBaseFilter@@UAAKXZ)
Pocket PC 2003 (ARMV4)\Debug/Ball_Filter.dll : fatal error LNK1120: 1 unresolved externals
For Debug mode, I added this code to the CSource-derived class:

public:
    // Increment the reference count for an interface.
    STDMETHODIMP_(ULONG) NonDelegatingAddRef()
    {
        return InterlockedIncrement(&m_cRef);
    }

    // Decrement the reference count for an interface.
    STDMETHODIMP_(ULONG) NonDelegatingRelease()
    {
        if (InterlockedDecrement(&m_cRef) == 0)
        {
            delete this;
            return 0;
        }
        return m_cRef;
    }

Now it is working fine; I have successfully built the filter code.

Thursday, March 27, 2008

Important notes

Negotiating Pin Connections:
------------------------------------
When the Filter Graph Manager tries to connect two filters, the pins must agree on various things. If they cannot, the connection attempt fails. Generally, pins negotiate the following:
1.Transport
2.Media type
3.Allocator
Transport:
------------
The transport is the mechanism that the filters will use to move media samples from the output pin to the input pin. For example, they can use the IMemInputPin interface ("push model") or the IAsyncReader interface ("pull model").

Media type:
------------
Almost all pins use media types to describe the format of the data they will deliver.
Allocator:
--------------
The allocator is the object that creates the buffers that hold the data. The pins must agree which pin will provide the allocator. They must also agree on the size of the buffers, the number of buffers to create, and other buffer properties.
The base classes implement a framework for these negotiations. You must complete the details by overriding various methods in the base class. The set of methods that you must override depends on the class and on the functionality of your filter.

Processing and Delivering Data:
------------------------------------
The primary function of most filters is to process and deliver media data. How that occurs depends on the type of filter:
A push source has a worker thread that continuously fills samples with data and delivers them downstream.

A pull source waits for its downstream neighbor to request a sample. It responds by writing data into a sample and delivering the sample to the downstream filter. The downstream filter creates the thread that drives the data flow.
A transform filter has samples delivered to it by its upstream neighbor. When it receives a sample, it processes the data and delivers it downstream.
A renderer filter receives samples from upstream, and schedules them for rendering based on the time stamps.

Tuesday, March 25, 2008

RTP audio filter problem in latency...

1. Yesterday we completed the packet-loss adjustment mechanism.
2. Today we improved the audio quality by implementing IAMPushSource and setting the latency:
GetLatency() { *plLatency = 3500000; return S_OK; } // 350 ms
I implemented IAMPushSource on my live source filter's output pin. I faced a 2-3 second latency problem, so I modified things as follows:

STDMETHODIMP CRTPAudioStream::GetPushSourceFlags(ULONG *pFlags)
{
    // The filter timestamps the samples using a private clock.
    // The clock is not available to the rest of the graph through IReferenceClock.
    *pFlags = AM_PUSHSOURCECAPS_PRIVATE_CLOCK;
    return S_OK;
}
Now everything works fine. Within the filter, we implemented it as follows:

class CRTPFilter : public CSource, public IAMFilterMiscFlags
{
public: // IAMFilterMiscFlags override
    ULONG STDMETHODCALLTYPE GetMiscFlags(void)
    { return AM_FILTER_MISC_FLAGS_IS_SOURCE; }
};

class CRTPAudioStream : public CSourceStream, public IAMPushSource
{
public:
    // IAMPushSource
    STDMETHODIMP GetMaxStreamOffset(REFERENCE_TIME *prtMaxOffset)
    { return E_NOTIMPL; }
    STDMETHODIMP GetPushSourceFlags(ULONG *pFlags)
    { *pFlags = AM_PUSHSOURCECAPS_PRIVATE_CLOCK; return S_OK; }
    STDMETHODIMP GetStreamOffset(REFERENCE_TIME *prtOffset)
    { return E_NOTIMPL; }
    STDMETHODIMP SetMaxStreamOffset(REFERENCE_TIME rtMaxOffset)
    { return E_NOTIMPL; }
    STDMETHODIMP SetPushSourceFlags(ULONG Flags)
    { return E_NOTIMPL; }
    STDMETHODIMP SetStreamOffset(REFERENCE_TIME rtOffset)
    { return E_NOTIMPL; }
    STDMETHODIMP GetLatency(REFERENCE_TIME *prtLatency)
    { *prtLatency = 4500000; return S_OK; } // 450 ms latency for audio
};
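REFERENCE_TIME is expressed in 100-nanosecond units, so milliseconds convert as below; MsToReferenceTime is an illustrative helper, not part of the SDK:

```cpp
#include <cassert>

// REFERENCE_TIME is in 100-ns units: 1 ms = 10,000 units.
typedef long long REFERENCE_TIME;

inline REFERENCE_TIME MsToReferenceTime(long long ms)
{
    return ms * 10000;
}
```

So 450 ms of latency is 4500000 units, and the earlier 350 ms value is 3500000 units.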

Saturday, March 15, 2008

CSource Filter architecture with memory mapped file

For a network source filter, we cannot receive and render the data efficiently if we implement the packet queue and rendering entirely inside the source filter; doing all of that in one filter takes too much CPU time. So we can do it a different way:

Within the source filter, we read data from a memory-mapped file. A separate application reads data from the network and writes it to the memory-mapped file. Both the source filter and the network-receiving application must reside on the same system. Done this way, we can render the received audio and video data without any problem, and we can also queue the received network packets.

I have seen an application that works this way:
1. A camera source filter renders the data from the memory-mapped file.
2. Another application creates the filter graph with the source filter and implements a thread that receives packets from the network and queues them in a data structure.

Add Worker Thread in Source Filter derived from CSource class

How to create threads within a CSource-derived class:
Within my PushDesktop source filter, I created a thread. The thread increments an integer, and FillBuffer() displays the integer value:

ThreadProc()
{
    i++;
    SetEvent(m_hEvent);
}

FillBuffer()
{
    WaitForSingleObject(m_hEvent, INFINITE);
}

The following code was not working:

ThreadProc()
{
    while (true)
    {
        dwWaitObject = WaitForSingleObject(m_hStopEvent, 5);
        if (dwWaitObject == WAIT_OBJECT_0) { return 0; }
        i++;
        SetEvent(m_hEvent);
    }
}

FillBuffer()
{
    WaitForSingleObject(m_hEvent, INFINITE);
}

Every second, the video renderer waits for data from the source filter, so the thread loop must not take that long per iteration. If you change
dwWaitObject = WaitForSingleObject(m_hStopEvent, 5);
to
dwWaitObject = WaitForSingleObject(m_hStopEvent, 2);
then it works; we must not spend that much time in the loop.
One more thing to try:

ThreadProc() { ThreadCB(); }
ThreadCB()
{
    WaitForSingleObject(m_hEvent, INFINITE);
    printf("\n sundar");
}
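A portable sketch of the same producer/consumer signaling, using std::thread and std::condition_variable in place of the Win32 events above (an adaptation, not the original Windows CE code):

```cpp
#include <atomic>
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <thread>

// The worker increments a counter and signals; the "FillBuffer" side
// waits for the signal and reads the new value.
struct FrameCounter {
    std::mutex mtx;
    std::condition_variable cv;
    int counter = 0;
    std::atomic<bool> stop{false};

    void ThreadProc() {                  // worker thread loop
        while (!stop) {
            {
                std::lock_guard<std::mutex> lk(mtx);
                ++counter;               // i++
            }
            cv.notify_one();             // SetEvent(m_hEvent)
            std::this_thread::sleep_for(std::chrono::milliseconds(2));
        }
    }

    int WaitForFrame() {                 // FillBuffer() side
        std::unique_lock<std::mutex> lk(mtx);
        int seen = counter;
        cv.wait(lk, [&] { return counter > seen || stop; });
        return counter;
    }
};
```

Keeping the sleep short (2 ms here) mirrors the timeout fix above: the worker must not stall the renderer waiting for data.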

Tuesday, March 11, 2008

RTP source Filter

RTP Source Filter :
---------------------
From the RTP source filter, we just call the APIs in the RTP stack DLL, as follows:

RTPReceiveVideo();
if (HasMPEG4Frames() == TRUE)
{
    GetMPEG4Frame(pbBufferdata, pbBufferSize, iTimestamp);
}

Within this RTP stack DLL, we implemented the RTP parser.
Normally only key frames are wrapped in more than one RTP packet.
So what we have done is: if a frame is wrapped in more than one RTP packet, we put the packets in a queue. Only when we have received all the consecutive RTP packets up to the end of that frame do we return TRUE from HasMPEG4Frames(); otherwise we discard the packet.
Notes:
-----------
ffmpeg sends every MPEG-4 encoded frame beginning in a new packet.
If a frame is contained in more than one packet, say 3 RTP packets, the 3rd RTP packet will not contain the next frame's data; the next frame is wrapped in the 4th RTP packet.
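The queuing rule above can be sketched as follows (simplified: the marker flag stands in for the real RTP header's marker bit, and the sequence-number continuity check and discard path are omitted):

```cpp
#include <cassert>
#include <vector>

// Fragments of a frame are buffered until the packet that ends the frame
// arrives; only then does HasMPEG4Frames() report a complete frame.
struct RtpPacket {
    std::vector<unsigned char> payload;
    bool marker = false; // true on the last packet of a frame
};

struct FrameAssembler {
    std::vector<unsigned char> pending;             // fragments of the current frame
    std::vector<std::vector<unsigned char>> frames; // completed frames

    void Receive(const RtpPacket& p) {
        pending.insert(pending.end(), p.payload.begin(), p.payload.end());
        if (p.marker) {          // frame complete
            frames.push_back(pending);
            pending.clear();
        }
    }

    bool HasMPEG4Frames() const { return !frames.empty(); }

    std::vector<unsigned char> GetMPEG4Frame() {
        std::vector<unsigned char> f = frames.front();
        frames.erase(frames.begin());
        return f;
    }
};
```

A frame split across three packets becomes available only after the third (marker) packet arrives, matching the HasMPEG4Frames()/GetMPEG4Frame() flow above.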