Grab Frame from RTPClient Stream using H.264 on Xamarin

Topics: Question
Mar 16, 2016 at 1:26 PM
Hello Julius,

thanks for your help, I got my RtpClient working now and receive RTP packets and frames.

I want to hear audio and show video now on both an Android and an iOS device.
I am using Xamarin with Visual Studio 2015.

I have read here: "Grab Frame from H.264 RFC6184Stream"
https://net7mma.codeplex.com/discussions/569279

Where at the end you point to a sample in the tests:
TestRFC6184Frame

I cannot find it; was it removed?

Both the examples here, which always use Media.Rtsp.Server.MediaTypes.RFC6184Media.RFC6184Frame, and the csCodes samples do not seem to work with Xamarin because they use System.Drawing 4.0, which is not available there.

Do you have any idea where or how to start when trying to show an RTP frame using Xamarin?

Thanks a lot again,
Firlefanz
Mar 16, 2016 at 1:45 PM
I found it: static void TestRFC6184VideoFrame()
:-)
Coordinator
Mar 16, 2016 at 3:56 PM
Edited Mar 16, 2016 at 3:59 PM
I am working on this among other things; unfortunately I don't have much to offer in this area right now.

After I get the entropy encoding and quantization examples working on the encoder, I will probably implement some type of I-frame-only decoder.

After Media.Image is complete I can replace the use of System.Drawing with it and work on more of a full encoder and decoder solution.

There is also JPEG, PNG, Bitmap and GIF to consider among many others... :)

And yes, correct: you can use external decoders by preparing the depacketized stream as that example shows.
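
Very roughly, preparing the data for an external decoder looks something like the sketch below. It only follows what TestRFC6184VideoFrame does; GetAnnexBData is just a name used here, and it assumes the RtpFrame you pass in is complete and that RFC6184Frame exposes Depacketize() and Buffer as in that test.

// Sketch based on TestRFC6184VideoFrame: turn a completed video RtpFrame into a
// raw Annex B byte stream that an external H.264 decoder can consume.
static byte[] GetAnnexBData(Media.Rtp.RtpFrame rtpFrame)
{
    using (var h264Frame = new Media.Rtsp.Server.MediaTypes.RFC6184Media.RFC6184Frame(rtpFrame))
    {
        h264Frame.Depacketize(); // reassembles FU-A / STAP-A payloads into NAL units

        using (var ms = new System.IO.MemoryStream())
        {
            h264Frame.Buffer.Position = 0; // Buffer holds the depacketized data
            h264Frame.Buffer.CopyTo(ms);
            return ms.ToArray();           // hand this to your decoder
        }
    }
}

If you need anything else, let me know!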
Marked as answer by juliusfriedman on 3/16/2016 at 8:56 AM
Mar 21, 2016 at 11:43 AM
Hello Julius,

I still cannot get it working; maybe you have an idea what I am doing wrong...

RTP packets are received and look OK on my Android 4.4 phone and Android 5.0 tablet.

So before activating the RtpClient I stored the local port for each media type (and the remote port for video):
foreach (var context in rtpClient.GetTransportContexts())
{
    if (context.MediaDescription.MediaType == MediaType.audio)
    {
        // local port the RtpClient listens on for audio
        AudioPort = context.Localport;
        //MediaDescription md = context.MediaDescription;
        //if (md != null)
        //    remotePort = md.MediaPort;
    }
    else if (context.MediaDescription.MediaType == MediaType.video)
    {
        // local port the RtpClient listens on for video
        VideoPort = context.Localport;
        // port advertised in the session description for video
        MediaDescription md = context.MediaDescription;
        if (md != null)
            remotePort = md.MediaPort;
    }
}
The forums say that the Xamarin Android VideoView (or the Android MediaPlayer, "player" in the code below) can show an H.264-coded stream.

So I tried to open the stream (which plays very well on the PC when opened with the VLC player):

playVideoAction(String.Format("rtp://{0}:{1}", SipClientManager.ServerIP, remotePort));
...

videoview.SetVideoURI(Android.Net.Uri.Parse(fullPath));
videoview.Start();

or

player.SetDataSource(videoview.Context, Android.Net.Uri.Parse(fullPath));
player.Prepare();
player.Start();

The VideoView just says "Cannot Play Video", and the MediaPlayer crashes in Prepare().

So I tried the other way: I took the session description I start my RtpClient with and wrote it into a file.

string fileName = Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal), "IncomingCall.sdp");

Then I used both approaches from above with the file name instead of the parsed URI.
I get the same errors.
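
For reference, the file-based attempt looks roughly like this (a rough sketch of what I do; sessionDescription is the session description I start the RtpClient with):

// writing the SDP text; sessionDescription.ToString() is assumed to give the raw SDP
File.WriteAllText(fileName, sessionDescription.ToString());
// one way to hand the file to the VideoView instead of the rtp:// uri
videoview.SetVideoPath(fileName);
videoview.Start();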

Are these ideas wrong? Do I have to use a totally different approach? Can you recommend a way or give me some idea here?

Thanks a lot again,

Firlefanz
Mar 21, 2016 at 1:00 PM
Hello Julius,

on the Windows client (so I can post the saved file here), the SDP file looks like this:

v=0
o=root 1220363117 1220363117 IN IP4 192.168.1.109
s=Asterisk PBX GIT-master-60a15fe
c=IN IP4 192.168.1.109
b=CT:384
t=0 0
m=audio 18544 RTP/AVP 0 8
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=maxptime:150
a=sendrecv
m=video 24314 RTP/AVP 99 34
a=rtpmap:99 H264/90000
a=fmtp:99 sprop-parameter-sets=Z0KADJWgUH5A,aM4Ecg==
a=rtpmap:34 H263/90000
a=sendrecv
Coordinator
Mar 21, 2016 at 4:00 PM
Edited Mar 21, 2016 at 4:00 PM
Based on your previous capture, I would say you need to ensure that the ports in the Invite response (ACK) you send back are the same as the ports the RtpClient is going to be listening on. The code you have above is correct for obtaining the MediaPort from the RtpClient's TransportContexts, but without seeing both the Invite and the ACK I don't know whether that is the port on which the gateway expects data to be sent and received.
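
As a quick sanity check (just a diagnostic sketch, reusing the property names from your snippet above), you can log both values and compare them with what your Invite response advertises:

foreach (var context in rtpClient.GetTransportContexts())
{
    System.Diagnostics.Debug.WriteLine("{0}: local port {1}, SDP media port {2}",
        context.MediaDescription.MediaType,
        context.Localport,                    // the port the RtpClient is bound to
        context.MediaDescription.MediaPort);  // the port from the media description
}
// If the RtpClient was created from the incoming offer, MediaPort is the far end's
// port (where you send media); the port you advertise back should be Localport.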

I explained that in the previous thread; if something doesn't make sense, let me know what and I will try to explain again.

For the decoding, you may want to refer to these posts, which are doing similar things:

RTP on Android MediaPlayer

Play RTP stream withouth RTSP in Android VideoView

It seems that playing from an RTSP server with RTP as a source is supported, but not from an RTP session directly, even with the session description.

You may want to consider manually using MediaCodec to do this for you; the following thread links to a few good resources for this.

The API is found here: MediaCodec Class

Android MediaCodec decode h264 raw frame
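
To give a rough idea of what that looks like from Xamarin, here is a sketch only (not something from this library): the SPS/PPS values come from the sprop-parameter-sets line of your SDP above, the width/height are placeholders, and the Surface would come from a SurfaceView or TextureView.

using System;
using System.Linq;
using Android.Media;
using Android.Views;
using Java.Nio;

// Hypothetical helper: feeds depacketized Annex B H.264 data into MediaCodec
// and renders the decoded frames onto a Surface.
public static class H264SurfaceDecoder
{
    public static MediaCodec Create(Surface surface)
    {
        // SPS/PPS from a=fmtp sprop-parameter-sets, with Annex B start codes prepended
        byte[] startCode = { 0, 0, 0, 1 };
        byte[] sps = startCode.Concat(Convert.FromBase64String("Z0KADJWgUH5A")).ToArray();
        byte[] pps = startCode.Concat(Convert.FromBase64String("aM4Ecg==")).ToArray();

        MediaFormat format = MediaFormat.CreateVideoFormat("video/avc", 352, 288); // use the real stream dimensions
        format.SetByteBuffer("csd-0", ByteBuffer.Wrap(sps));
        format.SetByteBuffer("csd-1", ByteBuffer.Wrap(pps));

        MediaCodec decoder = MediaCodec.CreateDecoderByType("video/avc");
        decoder.Configure(format, surface, null, MediaCodecConfigFlags.None);
        decoder.Start();
        return decoder;
    }

    // Call once per depacketized access unit (start-code prefixed NAL units).
    public static void Decode(MediaCodec decoder, byte[] annexB, long presentationTimeUs)
    {
        int inIndex = decoder.DequeueInputBuffer(10000);
        if (inIndex >= 0)
        {
            ByteBuffer input = decoder.GetInputBuffer(inIndex); // API 21+, use GetInputBuffers() below that
            input.Clear();
            input.Put(annexB);
            decoder.QueueInputBuffer(inIndex, 0, annexB.Length, presentationTimeUs, MediaCodecBufferFlags.None);
        }

        var info = new MediaCodec.BufferInfo();
        int outIndex = decoder.DequeueOutputBuffer(info, 10000);
        if (outIndex >= 0)
            decoder.ReleaseOutputBuffer(outIndex, true); // true = render to the Surface
    }
}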

Let me know if you need anything else!
Marked as answer by juliusfriedman on 3/21/2016 at 9:00 AM