Friday, February 29, 2008

How to set the UDP Socket Receive Buffer Size to the Maximum Packet Size ?

-------------------------------------------------------------------------------
#define UDP_TX_BUF_SIZE 32768
#define UDP_MAX_PKT_SIZE 65536

// Set the UDP receive buffer size to at least the largest possible UDP
// packet size, to avoid losing data on OSes that set it too low by default.
int tmp = 0;
int optlen = sizeof(tmp);
if (!getsockopt(udp_fd, SOL_SOCKET, SO_RCVBUF, (char*)&tmp, &optlen) &&
    tmp < UDP_MAX_PKT_SIZE)
{
    tmp = UDP_MAX_PKT_SIZE;
    setsockopt(udp_fd, SOL_SOCKET, SO_RCVBUF, (const char*)&tmp, sizeof(tmp));
}
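The same idea as a self-contained POSIX sketch (the function name is mine, and the original snippet is Winsock/ffmpeg-flavoured, so this is only an illustration of the technique):

```cpp
#include <sys/socket.h>
#include <netinet/in.h>
#include <unistd.h>

// Grow SO_RCVBUF on a UDP socket to at least min_size, and return the
// buffer size the OS actually reports afterwards. The OS may round the
// value up or cap it at a system-wide maximum.
int EnsureUdpRecvBuffer(int udp_fd, int min_size)
{
    int tmp = 0;
    socklen_t optlen = sizeof(tmp);
    if (getsockopt(udp_fd, SOL_SOCKET, SO_RCVBUF, &tmp, &optlen) == 0 &&
        tmp < min_size)
    {
        tmp = min_size;
        setsockopt(udp_fd, SOL_SOCKET, SO_RCVBUF, &tmp, sizeof(tmp));
        optlen = sizeof(tmp);
        getsockopt(udp_fd, SOL_SOCKET, SO_RCVBUF, &tmp, &optlen); // re-read actual value
    }
    return tmp;
}
```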

Thursday, February 28, 2008

Will a Source Filter without Timestamps work ?

Within the FillBuffer() fn of a source filter I have not set the timestamp, but the filter still works well. A timestamp is necessary for storing the media (audio and video) content to a file; even so, without setting a timestamp on the media sample, the source filter works well.

Wednesday, February 20, 2008

How can we extract the MPEG4 Frame Type (I, P or B) from the RTP packet ?
------------------------------------------------------------------------
We are creating software to parse MPEG4 data. Can anyone tell me how to extract the frame type (I, P or B) from the whole RTP packet? I mean, I have an RTP packet with an MPEG4 payload in it; how do I extract the frame type information from it?
Solution :
---------------
vop_coding_type is encoded in the first 2 bits after the vop_start_code.
vop_start_code is 00 00 01 B6 (hex).
vop_coding_type means:
00 - I-VOP
01 - P-VOP
10 - B-VOP
11 - S-VOP

We are sending MPEG4 video through RTP using ffmpeg.
Every I-frame (keyframe) is preceded by the VOS start code 00 00 01 B0.
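The table above can be turned into a small parser. This is a hedged sketch: the function name is an assumption of mine, not part of any SDK, and it assumes the payload contains an elementary-stream VOP header.

```cpp
#include <cstdint>
#include <cstddef>

// Scan a buffer for the VOP start code 00 00 01 B6 and return the
// vop_coding_type of the first VOP found:
// 0 = I-VOP, 1 = P-VOP, 2 = B-VOP, 3 = S-VOP, or -1 if none is present.
int GetVopCodingType(const uint8_t* pData, size_t cbData)
{
    for (size_t i = 0; i + 4 < cbData; ++i)
    {
        if (pData[i]     == 0x00 && pData[i + 1] == 0x00 &&
            pData[i + 2] == 0x01 && pData[i + 3] == 0xB6)
        {
            // vop_coding_type is the first 2 bits after the start code.
            return (pData[i + 4] >> 6) & 0x03;
        }
    }
    return -1; // no VOP start code found
}
```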

Monday, February 18, 2008

How can we achieve a Dynamic Buffer size for the output pin of a Filter ?
---------------------------------------------------------------------------
Within the DecideBufferSize() fn we have to allocate the maximum amount of memory (if we know it); the output pin's media sample size is then the same as the size chosen in DecideBufferSize(). Say DecideBufferSize() allocates 4000 bytes: each output media sample then has 4000 bytes of memory. Assume Frame1 has only 3000 bytes; the remaining 1000 bytes must be filled with zeroes.
The ZeroMemory() fn is available to set a buffer of a specified size to zero.

Frame1 Output Sample:

HRESULT Transform(IMediaSample* pInms, IMediaSample* pOutms)
{
    BYTE* pbInputData;
    DWORD dwInputSize;
    BYTE* pbOutputData;
    DWORD dwOutputSize;

    pInms->GetPointer(&pbInputData);
    dwInputSize = pInms->GetActualDataLength(); // 3000 - Frame1 has only 3000 bytes of data
    pOutms->GetPointer(&pbOutputData);
    dwOutputSize = pOutms->GetSize();           // 4000 - the allocator's buffer size

    ZeroMemory(pbOutputData, dwOutputSize);
    memcpy(pbOutputData, pbInputData, dwInputSize); // only 3000 bytes are copied;
                                                    // the remaining 1000 bytes stay 0
    return S_OK;
}
We can use this mechanism for a Resize filter: allocate memory on the output pin for the maximum resolution (say 1900x1400x3 bytes) and fill the rest of the bytes with zeroes.
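The zero-then-copy step can also be sketched outside DirectShow (a hypothetical helper, not taken from the filter):

```cpp
#include <cstring>
#include <cstdint>
#include <cstddef>
#include <vector>

// Place a frame into a fixed, maximum-size sample buffer: the buffer is
// zeroed first (the ZeroMemory step), then only the real frame bytes are
// copied in, so any trailing bytes remain zero.
std::vector<uint8_t> PadFrameToSampleSize(const uint8_t* pFrame,
                                          size_t cbFrame,
                                          size_t cbSampleSize)
{
    std::vector<uint8_t> sample(cbSampleSize, 0); // zero the whole sample
    if (pFrame != nullptr && cbFrame <= cbSampleSize)
        std::memcpy(sample.data(), pFrame, cbFrame); // copy only the real data
    return sample;
}
```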

Register the Protocol in Directshow

Register the Protocol for the Filter:
--------------------------------------
The objective: if we open a protocol URL in GraphEdit, our filter must be inserted to capture the data.
Example :
----------
I opened the RTP protocol in GraphEdit's Render URL option as follows:
rtp://127.0.0.1:5000/
It must insert a source filter, but DirectShow does not have a separate RTP source filter. I developed the source filter; while rendering the RTP URL, I want my filter to be inserted.
What do we need to do ?
---------------------
Develop the Registry setup for the RTP protocol with our source Filter
RTP.reg contents :
------------------
REGEDIT4

[HKEY_CLASSES_ROOT\rtp]
@="RTP : Realtime Transfer Protocol"
"EditFlags"=hex:02,00,00,00
"URL Protocol"=""
"Source Filter"="{fd501041-8ebe-11ce-8183-00aa00577da1}"

[HKEY_CLASSES_ROOT\rtp\DefaultIcon]
@="c:\\Program Files\\Windows Media Player\\wmplayer.exe,0"

[HKEY_CLASSES_ROOT\rtp\QuickView]
@="*"

[HKEY_CLASSES_ROOT\rtp\Shell]
@=""

[HKEY_CLASSES_ROOT\rtp\Shell\open]
"EditFlags"=hex:01,00,00,00

[HKEY_CLASSES_ROOT\rtp\Shell\open\command]
@="C:\\Program Files\\Windows Media Player\\wmplayer.exe %1"
By double clicking this file we can link our source filter with the RTP protocol. But the source filter must implement the IFileSourceFilter interface.
What have I done? I implemented IFileSourceFilter in a Ball source filter, which should be inserted whenever the RTP protocol is rendered. Then, within the Load() and GetCurFile() fns, we get control in the source filter.
If I open the rtp stream rtp://127.0.0.1:4000/ in Windows Media Player 9 (wmplayer.exe), it is not working. But if I open the rtp stream in Windows Media Player 6 (mplayer2.exe), it inserts the Ball filter and renders the data in the player.

How can we run the .reg file and modify the registry ?
------------------------------------------------------
start rtp.reg


Output of the RTP source Filter :
-------------------------------------
MEDIATYPE : MEDIATYPE_Video
Compression : DIVX
Width : 176
Height : 144

ffmpeg rate emulation for realtime streaming

We sent MPEG4 encoded data through ffmpeg in an RTP stream. At the receiver side, we are not getting the correct number of frames (the RTP sequence numbers differ greatly: for example, after sequence number 1526 I got 3600 or so).
ffmpeg sends the entire data within about 5 minutes (content that should stream for around 30 minutes), so the network stack's buffers can overflow, which may cause packet loss.
If we send the video data at the real stream rate, there should be no packet loss. So we send the data to the RTP stream using ffmpeg's rate emulation option (-re).
Even though we send the data with rate emulation, we still get packet loss.
Check it with an Ethereal packet capture: capture the RTP packets using Ethereal and the RTP stack application at the same time.
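A rate-emulated send might look like this (a hedged sketch: the input file name, address and port are placeholders, and exact option spellings vary between ffmpeg versions):

```shell
# Read the input at its native frame rate (-re) instead of as fast as
# possible, drop audio (-an), and stream the video over RTP without
# re-encoding. input.m4v and the address/port are illustrative only.
ffmpeg -re -i input.m4v -an -vcodec copy -f rtp rtp://127.0.0.1:5000/
```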

Wednesday, February 13, 2008

How can we effectly use Media Detector

Media Detector (MediaDet)
The Media Detector (MediaDet) object retrieves format information and still frames from file sources. The Media Detector does not require the Filter Graph Manager to function. To create this object, call CoCreateInstance. The class identifier is CLSID_MediaDet.
To read Windows Media™ files, the application must provide a software certificate, also called a key. Register the application as a key provider through the Media Detector's IObjectWithSite interface. For more information, see
Unlocking the Windows Media Format SDK.
The MediaDet object exposes the following interfaces:
IMediaDet
IObjectWithSite


IMediaDet Interface
The IMediaDet interface retrieves information about a media file, such as the number of streams, and the media type, duration, and frame rate of each stream. It also contains methods for retrieving individual frames from a video stream. The Media Detector (MediaDet) object exposes this interface.
To obtain information about a file using this interface, perform the following steps:
Create an instance of the MediaDet object by calling CoCreateInstance. The class ID is CLSID_MediaDet.
Call IMediaDet::put_Filename to specify the name of the source file.
Call IMediaDet::get_OutputStreams to obtain the number of output streams in the source.
Call IMediaDet::put_CurrentStream to specify a particular stream.
Call any of the following methods:
IMediaDet::get_FrameRate
IMediaDet::get_StreamLength
IMediaDet::get_StreamMediaType
IMediaDet::get_StreamType
To retrieve a video frame, call IMediaDet::GetBitmapBits or IMediaDet::WriteBitmapBits. The returned frame is always in 24-bit RGB format.
A video thumbnail feature can also be achieved with this Media Detector.

What is Motion JPEG ?

In multimedia, Motion JPEG (M-JPEG) is an informal name for multimedia formats where each video frame or interlaced field of a digital video sequence is separately compressed as a JPEG image. It is often used in mobile appliances such as digital cameras.

Monday, February 11, 2008

Display settings affected the Filter's RGB output

The YV12-to-RGB565 filter is not working if the desktop display settings are in 32-bit mode.

The YV12-to-RGB565 filter gives RGB16 output if the desktop mode is RGB16, and RGB32 output if the desktop mode is RGB32. Even though we set the output format of the YV12-to-RGB565 filter as RGB16, it is being set as RGB32. In other words, the output RGB format of the filter is determined by the display settings mode. For example: I developed the YV12-to-RGB565 filter, which outputs the RGB565 format. RGB565 works if the display settings are in 16-bit mode, but when I check the filter's output pin properties, the format becomes RGB32 if the display settings are in 32-bit mode.

So I constrained the YV12-to-RGB filter's output format to RGB565 within the CheckTransform() fn; with that check in place I got the problem that the YV12-to-RGB filter no longer connects to the MPEG4 Decoder filter.
How can we solve this issue ?
Based on the display settings mode, I set the output buffer size for RGB16 or RGB32.


ffmpeg Command To convert the audio to mp2

ffmpeg Command To convert the audio to mp2 :
-------------------------------------------------
ffmpeg -i "D:\Media\Vel1.mpg" -vn -acodec mp2 -ac 1 -ar 22050 -ab 64k "D:\Media\vel1.mp2"

While opening the MP2 file in GraphEdit, I got the following graph:

Vel1.mp2 -> MPEG-1 Stream Splitter (Audio) -> MPEG Audio Decoder -> Default DirectSound Device

By default, all of these filters are available in DShow as built-in filters.

what is MP2 audio ?

MPEG-1 Audio Layer II (MP2, sometimes Musicam) is an audio codec defined by ISO/IEC 11172-3. An extension exists: MPEG-2 Layer II and is defined in ISO/IEC 13818-3. The file extension for files containing such audio data is usually .mp2. While it has largely been superseded by MP3 for PC and Internet applications, it remains a dominant standard for audio broadcasting as part of the DAB digital radio and DVB digital television standards.

Wednesday, February 06, 2008

How can we debug Filters with GraphEdit ?

Within the filter development project, do the following.

Put a breakpoint in the Transform() fn, or wherever a breakpoint is needed.
In Project -> Settings -> Debug, set "Executable for debug session" to C:\DXSDK\Bin\DXUtils\graphedt.exe.

And set the Working Directory to the filter project's directory, for example:

D:\SampleApps\Directshow\MPEG4DecoderFilter

Now press F5; this will launch GraphEdit. Insert our filter and run the graph, and you will get breakpoint debugging control.

YUV420P to YV12 format conversion

I converted YUV420P to YV12 format as follows.

The YV12 format is essentially the same as YUV420P, but with the U and V data reversed: the Y values are followed by the V values, with the U values last.

Our MPEG4 decoder's output is YUV420P, so we need to convert it to YV12. I used the following code to convert from YUV420P to YV12:

long ulNumberOfPixels = m_iWidth * m_iHeight;
long ulUVBufferSize   = (m_iWidth * m_iHeight) / 4;
unsigned char ch;

// Swap the U plane with the V plane, byte by byte.
for (long i = 0; i < ulUVBufferSize; i++)
{
    ch = pbYUVData[ulNumberOfPixels + i];
    pbYUVData[ulNumberOfPixels + i] = pbYUVData[ulNumberOfPixels + ulUVBufferSize + i];
    pbYUVData[ulNumberOfPixels + ulUVBufferSize + i] = ch;
}
```
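The same plane swap can be written as a self-contained helper (illustrative names, not taken from the filter):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>

// For a width x height YUV420P buffer, exchange the U plane with the
// V plane in place, which yields YV12 (Y plane, then V, then U).
void SwapUVPlanes(uint8_t* pbYUVData, int iWidth, int iHeight)
{
    const size_t cPixels = static_cast<size_t>(iWidth) * iHeight;
    const size_t cbPlane = cPixels / 4; // each chroma plane is 1/4 of the Y plane
    uint8_t* pU = pbYUVData + cPixels;
    uint8_t* pV = pU + cbPlane;
    std::swap_ranges(pU, pU + cbPlane, pV); // swap the two planes byte-for-byte
}
```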

Resources are not Properly Released in DShow Transform Filter ...


The EndStreaming() fn was not being called to release the resources. Then I looked at the DShow MSDN CTransformFilter docs: there is no EndStreaming() fn; we have to implement the StopStreaming() fn to release the resources.
Next I got an error like this:
---------------------------
GraphEdit
---------------------------
Some filters reported an error while stopping. The graph may become unpredictable. Unspecified error (Return code: 0x80004005)
---------------------------
OK
---------------------------
So I put a breakpoint and checked the application. The cause of the error:

BYTE* pbData;

Without ever allocating memory to it, I released it as follows:

if (pbData) { delete[] pbData; pbData = NULL; }

An uninitialized pointer holds a garbage (non-NULL) value, so the if check passes and delete[] is called on invalid memory; this causes the failure in execution.
So if we are going to allocate memory for a pointer, just set it to NULL during initialization:

BYTE* pbData = NULL;

How can we skip the frame in a Temporal Decoder Filter ?


I developed a temporal decoder filter by deriving from the CTransformFilter class. During decoding, I am not always able to get decoded data for the current frame.

In CTransformFilter's Transform() fn, if we do not have decoded data, then we have to skip the current frame:

if we return S_FALSE from the Transform() fn, it will skip the current frame (the sample is not delivered downstream).

Example :

HRESULT CTemporalDecoder::Transform(IMediaSample* pSrc, IMediaSample* pDest)
{
    return DecodeBuffer(pSrc, pDest);
}

HRESULT CTemporalDecoder::DecodeBuffer(IMediaSample* pSrc, IMediaSample* pDest)
{
    BYTE* pbEncodedData;
    pSrc->GetPointer(&pbEncodedData);

    if (!Decode(pbEncodedData)) // Decode() doesn't return a decoded buffer
    {
        return S_FALSE; // Skip the current frame.
    }
    // ... copy the decoded frame into pDest ...
    return S_OK;
}

MPEG4 Frame Construction( Identify the Frame Size)

I have to construct frames from raw bytes. I need to develop a class for this and test it with the MPEG4 Decoder SDK in MPeg4DecoderExe.
Algorithm for constructing a frame from bytes :
-----------------------------------------------
1. Pass the bytes (with their size) to the algorithm and append them to a custom buffer.
2. Check for the start code 00 00 01 B6, count the number of bytes, and check for the beginning of a new frame.
3. If we find a complete frame, copy it, send it to MPEG4FeedData(), and remove the frame bytes from the custom buffer.
4. Call the PullData() fn...

Tuesday, February 05, 2008

How can we identify the Frame Size in a .m4v file ?

From the start code 0x00 00 01 B6 to the next start-code prefix 0x00 00 01, we can calculate the buffer size. That is the frame size.


00 00 01 B0 ...
00 00 01 B6 ...
00 00 01 xx

If xx is B6 then there is no problem: it is the next frame.

If xx is other than B6, we have to take it as a header (B0, for example, is a VOS header).

So next I have to develop the application to frame the buffer.
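The rule above can be sketched as a small function (the name is mine, for illustration; it assumes the buffer begins at a 00 00 01 B6 VOP start code):

```cpp
#include <cstdint>
#include <cstddef>

// Return the frame size: the byte count from the start of the buffer up to
// (not including) the next 00 00 01 start-code prefix, or cbData if no
// further start code is found in the buffer.
size_t GetFrameSize(const uint8_t* pData, size_t cbData)
{
    // Skip the 4-byte VOP start code itself, then scan for the next prefix.
    for (size_t i = 4; i + 2 < cbData; ++i)
    {
        if (pData[i] == 0x00 && pData[i + 1] == 0x00 && pData[i + 2] == 0x01)
            return i; // the frame runs from offset 0 up to this start code
    }
    return cbData; // no further start code: the rest of the buffer is the frame
}
```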

Friday, February 01, 2008

DLL registration problem

I copied the PushSource filter.

The PushSource sample has 3 filters: PushPinBitmap, PushPinBitmapSet and PushPinDesktop. I removed the code for PushPinBitmap and PushPinBitmapSet.

The filter compiles without error, but while registering the filter (.ax file) I got the following error:
---------------------------
RegSvr32
---------------------------
DllRegisterServer in D:\PushSource.ax failed. Return code was: 0xc0000005 ( Unrecoverable Error)
---------------------------
OK
---------------------------

So the problem lies in DLL registration part.


CFactoryTemplate g_Templates[3] =
{
    { g_wszPushDesktop,                  // Name
      &CLSID_PushSourceDesktop,          // CLSID
      PushSourceDesktop::CreateInstance,
      NULL,
      &sudPushSourceDesktop }
};

This is the factory template. The array is declared with 3 entries but only one is initialized, so the registration code iterates over two empty entries and crashes with 0xc0000005.

Solution :
-------------
I modified the CFactoryTemplate array so the compiler sizes it from the initializer list:

CFactoryTemplate g_Templates[] =       // Note: no hard-coded count here
{
    { g_wszPushDesktop,
      &CLSID_PushSourceDesktop,
      PushSourceDesktop::CreateInstance,
      NULL,
      &sudPushSourceDesktop }
};
int g_cTemplates = sizeof(g_Templates) / sizeof(g_Templates[0]);

MPEG4's Maximum Frame Size

H.263 specifies the maximum frame size by means of the "BPPmaxKb" parameter. BPPmaxKb defines the maximum frame size in units of 1024 bits. The standard specifies a table for BPPmaxKb which depends on the resolution; for QCIF, the value of BPPmaxKb is 64 units, which translates into 8 Kbytes.

Does anybody know whether there is a limit on the maximum H.263 frame size?
For MPEG4 Simple Profile QCIF, the spec says (ISO/IEC 14496-2:2003(E), Annex D: Video buffering verifier, D.2 Video Rate Buffer Model Definition, clause 10):

The number of bits used for coding any single VOP, di, shall not exceed k * 16384 bits, where k = 4 for QCIF and Sub-QCIF, k = 16 for CIF, k = 32 for 4CIF, and k = 64 for 16CIF, unless a larger value of k is specified in the profile and level definition.

For QCIF, k = 4, so the maximum encoded frame size is 4 * 16384 bits = 65536 bits = 8 Kbytes.
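The Annex D arithmetic can be captured in a one-line helper (the function name is mine, for illustration):

```cpp
// Maximum coded VOP size in bytes for a given k factor from the spec:
// k = 4 for QCIF and Sub-QCIF, 16 for CIF, 32 for 4CIF, 64 for 16CIF.
unsigned long MaxVopSizeBytes(unsigned k)
{
    // d_i <= k * 16384 bits; divide by 8 to convert bits to bytes.
    return static_cast<unsigned long>(k) * 16384UL / 8UL;
}
```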

RTP Source Filter or SDP Source Filter ?

Develop the Stories :
-----------------------
Way 1 :
-------
1. Implement the IFileSourceFilter in the RTP source filter.
2. In IFileSourceFilter's Load() fn, do the following:

i) If the VOS header is available, set this VOS header in MPEG2VIDEOINFO's dwSequenceHeader, and set the media type's MPEG2VIDEOINFO width, height and further information.

ii) Wait for a VOS or VOL header. If the VOS header is not available, check for the VOL header; if the VOL header is available, frame the VOS header based on the VOL information, hardcoding the profile and level indication, and construct the VOS header.

3. Later, convert it to a socket. Instead of RTP MPEG4 source filters, we may call it an SDP Source Filter. RTP Source Filter or SDP Source Filter? Perhaps there will be no RTP source filter, only an SDP source filter, which can combine more than one stream.

