ONVIF camera: how to save an RTSP stream to a file

Topics: Question
Jun 20, 2016 at 3:44 PM
Edited Jun 20, 2016 at 4:05 PM
Hello,

I'm trying to save a stream (H.264) from my ONVIF camera to a file, video only for the moment. But the generated file is not usable: when I play it in VLC I just see some pixels, most of them green or gray.

One other thing: the file grows very slowly, just 1 KB in 20 seconds, which also seems very strange.

Here is my code; maybe I did something wrong:
public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();
        }

        private void Button_Click(object sender, RoutedEventArgs e)
        {
            RtspClient client = new RtspClient("rtsp://192.168.1.51:554/onvif1", RtspClient.ClientProtocolType.Udp);     
            client.OnConnect += Client_OnConnect;
            client.OnResponse += Client_OnResponse;
            client.OnPlay += Client_OnPlay;
            client.Connect();
        }



        private void Client_OnResponse(RtspClient sender, RtspMessage request, RtspMessage response)
        {
            Console.WriteLine("request-------------------");
            Console.WriteLine(request.ToString());
            Console.WriteLine("response-------------------");
            Console.WriteLine(response.ToString());
        }

        private void Client_OnConnect(RtspClient sender, object args)
        {
            if (sender.IsConnected)
            {
                if (!sender.IsPlaying)
                {
                    sender.SocketReadTimeout = sender.SocketWriteTimeout = 30000;
                    sender.StartPlaying();
                }
            }
        }

        private void Client_OnPlay(RtspClient sender, object args)
        {
            if (sender.IsPlaying)
            {
                sender.Client.FrameChangedEventsEnabled = true;
                sender.Client.RtpFrameChanged += Client_RtpFrameChanged;
            }
        }


        bool m_InitializedStream;

        private void Client_RtpFrameChanged(object sender, Media.Rtp.RtpFrame frame = null, Media.Rtp.RtpClient.TransportContext tc = null, bool final = false)
        {
            string outputFileName = "Test.h264";
            using (var fs = new System.IO.FileStream(outputFileName, System.IO.FileMode.Append))
            {
                if (false == final || Media.Common.IDisposedExtensions.IsNullOrDisposed(frame) || false == frame.IsComplete) return;

                var context = tc ?? ((Media.Rtp.RtpClient)sender).GetContextByPayloadType(frame.PayloadType);

                if (context == null || context.MediaDescription.MediaType != Media.Sdp.MediaType.video) return;

                using (Media.Rtsp.Server.MediaTypes.RFC6184Media.RFC6184Frame profileFrame = new Media.Rtsp.Server.MediaTypes.RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize();
                    if (false == profileFrame.HasDepacketized) return;

                    //If there is no SPS or PPS in band and this is the first frame given to a decoder, it needs to contain an SPS and PPS
                    //These are typically retrieved from the SessionDescription or CodecPrivateData, but only the very first time.
                    if (m_InitializedStream == false && (false == profileFrame.ContainsSequenceParameterSet || false == profileFrame.ContainsPictureParameterSet))
                    {
                        Media.Sdp.Lines.FormatTypeLine fmtp = new Media.Sdp.Lines.FormatTypeLine(context.MediaDescription.FmtpLine);

                        //Media.Sdp.Lines.FormatTypeLine fmtp = new Media.Sdp.Lines.FormatTypeLine(Media.Sdp.SessionDescriptionLine.Parse("a=fmtp:97 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg=="));
                        //if (false == fmtp.HasFormatSpecificParameters) throw new System.Exception("HasFormatSpecificParameters is false");
                        byte[] sps = null, pps = null;

                        //If there was a fmtp line then iterate the parts contained.
                        foreach (string p in fmtp.Parts)
                        {
                            //Determine where in the string the desired token is.
                            string token = Media.Common.Extensions.String.StringExtensions.Substring(p, "sprop-parameter-sets=");

                            //If present extract it.
                            if (false == string.IsNullOrWhiteSpace(token))
                            {
                                //Get the strings which correspond to the data, split by ','
                                string[] data = token.Split(',');

                                //If there is any data then assign it

                                if (data.Length > 0) sps = System.Convert.FromBase64String(data[0]);

                                if (data.Length > 1) pps = System.Convert.FromBase64String(data[1]);

                                //Done
                                break;
                            }
                        }

                        //Prepend the SPS if it was found
                        if (sps != null)
                        {
                            //Write a leading zero so the start code is the 4-byte form (00 00 00 01)
                            fs.WriteByte(0);

                            //Write the 3-byte start code prefix
                            fs.Write(Media.Codecs.Video.H264.NalUnitType.StartCodePrefix, 0, 3);

                            //Write the SPS
                            fs.Write(sps, 0, sps.Length);
                        }
                        else throw new System.Exception("SequenceParameterSet not found");

                        //Prepend the PPS if it was found.
                        if (pps != null)
                        {
                            //Write a leading zero so the start code is the 4-byte form (00 00 00 01)
                            fs.WriteByte(0);

                            //Write the 3-byte start code prefix
                            fs.Write(Media.Codecs.Video.H264.NalUnitType.StartCodePrefix, 0, 3);

                            //Write the PPS
                            fs.Write(pps, 0, pps.Length);
                        }
                        else throw new System.Exception("PictureParameterSet not found");

                        m_InitializedStream = true;
                    }

                    profileFrame.Buffer.CopyToStream(fs);

                }
            }
        }
    }
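The SPS/PPS prepending above is the key step for a raw .h264 file: a decoder needs the parameter sets, written in Annex B form (a start code before each NAL unit), before the first slice. As a language-agnostic illustration of the same idea, here is a minimal Python sketch that decodes sprop-parameter-sets from an fmtp line (the values below are taken from the camera's SDP shown later in this post; the function names are illustrative, not part of any library):

```python
import base64

START_CODE = b"\x00\x00\x00\x01"  # 4-byte Annex B start code

def sps_pps_from_fmtp(fmtp: str):
    """Decode sprop-parameter-sets from an SDP fmtp attribute value."""
    for part in fmtp.split(";"):
        part = part.strip()
        if part.startswith("sprop-parameter-sets="):
            b64_sets = part.split("=", 1)[1].split(",")
            return [base64.b64decode(s) for s in b64_sets]
    return []

def annex_b_header(fmtp: str) -> bytes:
    """Build the bytes to prepend to the file: start code + SPS, start code + PPS."""
    return b"".join(START_CODE + nal for nal in sps_pps_from_fmtp(fmtp))

fmtp = "packetization-mode=1;profile-level-id=42001F;sprop-parameter-sets=Z0IAH5WoFAFuQA==,aM48gA=="
header = annex_b_header(fmtp)
print(header.hex())
```

The low 5 bits of the first byte of each decoded set are the NAL unit type: 7 for an SPS, 8 for a PPS, which is a quick sanity check that the base64 decoding worked.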
The requests/responses in my console look good, I think:
request-------------------
OPTIONS rtsp://192.168.1.51:554/onvif1 RTSP/1.0
CSeq: 1

response-------------------
RTSP/1.0 200 OK

CSeq: 1
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER,USER_CMD_SET


request-------------------
DESCRIBE rtsp://192.168.1.51:554/onvif1 RTSP/1.0
Accept: application/sdp
CSeq: 2


response-------------------
RTSP/1.0 200 OK

CSeq: 2
Content-Type: application/sdp
Content-Length: 421

v=0
o=- 1421069297525233 1 IN IP4 192.168.1.51
s=H.264 Video, RtspServer_0.0.0.2
t=0 0
a=tool:RtspServer_0.0.0.2
a=type:broadcast
a=control:*
a=range:npt=0-
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=42001F;sprop-parameter-sets=Z0IAH5WoFAFuQA==,aM48gA==
a=control:track1
m=audio 0 RTP/AVP 8
a=control:track2
a=rtpmap:8 PCMA/8000


request-------------------
SETUP rtsp://192.168.1.51/onvif1/track1 RTSP/1.0
Transport: RTP/AVP/UDP;unicast;client_port=10000-10001;mode="PLAY"
CSeq: 3


response-------------------
RTSP/1.0 200 OK

CSeq: 3
Transport: RTP/AVP;unicast;destination=192.168.1.39;source=192.168.1.51;client_port=10000-10001;server_port=7054-7055
Session: 27191f96;timeout=60


request-------------------
SETUP rtsp://192.168.1.51/onvif1/track2 RTSP/1.0
Transport: RTP/AVP/UDP;unicast;client_port=10003-10004;mode="PLAY"
CSeq: 4
Session: 27191f96


response-------------------
RTSP/1.0 200 OK

CSeq: 4
Transport: RTP/AVP;unicast;destination=192.168.1.39;source=192.168.1.51;client_port=10003-10004;server_port=7056-7057
Session: 27191f96;timeout=60
Coordinator
Jun 20, 2016 at 4:00 PM
Seems okay to me; you are probably dropping packets.

You will want to verify this by checking the sequence numbers of received frames, or otherwise by using the state of the TransportContext.
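Checking for drops by sequence number can be sketched independently of the library; RTP sequence numbers are 16-bit and wrap at 65536, so the delta must be computed modulo 2^16 (the function name below is illustrative, not part of the library):

```python
def count_missing(seq_numbers):
    """Count how many sequence numbers were skipped in an RTP stream.

    RTP sequence numbers are 16-bit, so the delta between consecutive
    packets is computed modulo 65536 to survive wraparound."""
    missing = 0
    prev = None
    for seq in seq_numbers:
        if prev is not None:
            gap = (seq - prev) & 0xFFFF  # modular delta handles wrap
            if gap > 1:
                missing += gap - 1
        prev = seq
    return missing

print(count_missing([24335, 24336, 24338]))  # one packet (24337) missing
```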

If you are dropping packets you may want to try increasing the ReceiveBufferSize; each TransportContext has a RecieveBufferSizeMultiplier which can be increased or decreased when required.

This code will set the Rtp and Rtcp receive buffer sizes for each TransportContext an RtpClient instance contains.
foreach(var tc in RtspClient.Client.GetTransportContexts())
            {
                Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(tc, 8192);
            }
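Underneath the library's ReceiveBufferSize sits the operating system's socket receive buffer, which is what actually absorbs bursts: once it fills, further UDP datagrams are dropped silently. As a quick language-agnostic demonstration, here is how the same knob looks at the raw socket level in Python (the OS may cap or round the requested value):

```python
import socket

# A UDP socket silently drops datagrams once its kernel receive buffer is
# full, e.g. while the application is busy writing to a file. Requesting a
# larger buffer gives the receiver more slack before packets are lost.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
default_size = sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 1 << 20)  # request 1 MiB
actual_size = sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
print("default:", default_size, "after request:", actual_size)
sock.close()
```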
If you need anything else let me know!
Marked as answer by juliusfriedman on 6/20/2016 at 9:00 AM
Jun 20, 2016 at 4:23 PM
Edited Jun 20, 2016 at 4:24 PM
OK, I added this line to my Client_RtpFrameChanged event handler (Console.WriteLine(frame.LowestSequenceNumber + "/" + frame.HighestSequenceNumber);):
string outputFileName = "Test.h264";
            using (var fs = new System.IO.FileStream(outputFileName, System.IO.FileMode.Append))
            {

                Console.WriteLine(frame.LowestSequenceNumber + "/" + frame.HighestSequenceNumber);

                if (false == final || Media.Common.IDisposedExtensions.IsNullOrDisposed(frame) || false == frame.IsComplete) return;
And the output is:

24335/24335
24337/24337
24336/24336
24338/24338
24337/24337
24339/24339
24338/24338
24340/24340
24339/24339
24341/24341
24340/24340
24342/24342
24341/24341
24343/24343
24342/24342
24344/24344
24343/24343
24345/24345
24344/24344
24346/24346
24345/24345
24347/24347
24346/24346
24348/24348
24347/24347
24349/24349
24348/24348
24350/24350
24349/24349
24351/24351
24350/24350
24352/24352
24351/24351
24353/24353
24352/24352
24354/24354
24353/24353
24355/24355
24354/24354
24356/24356
24355/24355
24357/24357
24356/24356
24358/24358
24357/24357
24359/24359
24358/24358
24360/24360
24359/24359
24361/24361
24360/24360
24362/24362
24361/24361
24363/24363
24362/24362
24364/24364
24363/24363
24365/24365
24364/24364
24366/24366

Can you explain what LowestSequenceNumber and HighestSequenceNumber mean? And why do they appear unsorted in my output?
Jun 20, 2016 at 4:30 PM
Edited Jun 20, 2016 at 4:34 PM
OK, my fault... I had to place this line later in the code:
if (context == null || context.MediaDescription.MediaType != Media.Sdp.MediaType.video) return;

                Console.WriteLine(frame.LowestSequenceNumber + "/" + frame.HighestSequenceNumber);
Now it looks good:
26198/26203
26204/26207
26208/26211
26212/26214
26215/26215
26216/26218
26219/26219
26220/26221
26222/26225
26226/26228
131109139573006083@RtspMessage

@Finalize Completed
131109139573006083@RtspMessage

@Finalize Completed
131109139573016000@RtspMessage

@Finalize Completed
131109139573016000@RtspMessage

@Finalize Completed
131109139573016000@RtspMessage

@Finalize Completed
26229/26231
26232/26234
26235/26236
26237/26238
26239/26241
26242/26244
26245/26245
26246/26246
26247/26247
26248/26248
26249/26250
26251/26252
26253/26253
26254/26254
26255/26255
26256/26256
26257/26257
26258/26258
26259/26259
26260/26260
26261/26261
26262/26262
26263/26263
26264/26264
26265/26265
26266/26266
26267/26267
26268/26268
26269/26269
26270/26270
131109140079807799@RtspMessage

@Finalize Completed
131109140079817819@RtspMessage

@Finalize Completed
131109140079817819@RtspMessage

@Finalize Completed
26271/26271
26272/26272
26273/26273

Is this normal?
Coordinator
Jun 20, 2016 at 4:34 PM
Edited Jun 20, 2016 at 4:36 PM
Sure. HighestSequenceNumber and LowestSequenceNumber are relative to the frame in the event: the highest and lowest SequenceNumbers contained in the frame.

They are not unsorted...; they are occurring as they naturally arrive.

You will receive the FrameChanged event once for every frame when a packet arrives and is added to the frame.

You will also receive another event when the frame is being disposed; the difference is that the 'final' parameter will be true, allowing you a last chance to do something with the packets.

You can process the packets when they arrive or at the 'final' event.

In your code you seem to be checking for the 'final' event or frame completion and then processing the packets within the frame.

If you process the packets in the frame before final is true, then ensure you Dispose the frame, as it will be received in another event if you do not.

If you want to just handle the frames at the last possible moment, then only check 'final'.

If after 'final' is true the frame is not complete, you can either drop the frame or attempt to recover the data therein.

Usually at that point there is already another frame received and waiting to be evented.

-Julius
Marked as answer by juliusfriedman on 6/20/2016 at 9:35 AM
Jun 23, 2016 at 2:04 PM
Edited Jun 24, 2016 at 10:45 AM
OK, I think my problem is that I have to use RFC 3984: http://www.ietf.org/rfc/rfc3984.txt

But you have no such frame class, so can you advise me how to start?

At the moment the video file looks like this:
http://abload.de/image.php?img=vlc10u9q.jpg

The top-left part seems to be correct... I need help.
Coordinator
Jun 24, 2016 at 7:41 PM
That RFC was obsoleted by RFC 6184, so I essentially DO have a frame class.

See RFC3984

Which became RFC6184

And it is implemented Here

As well as the Reduced Complexity Variant, RFC6185, which I have also implemented Here

How can I help you?
Marked as answer by juliusfriedman on 6/24/2016 at 12:41 PM
Jun 25, 2016 at 6:00 AM
OK, thanks for your answers. And you are right, RFC 6184 should work.

First I have to say I'm absolutely new to RTSP/RTP and video decoding, so please be patient with me.

My main goal is to get the video and audio streams from several IP cameras and show them on Android/iOS/Windows Phone with Xamarin.

So my first goal is to get the RTSP stream and save it to a file for testing (just playing it with VLC Player). But I fail.

I changed my code and now I'm just looking for final == true. But the live stream video file I create still seems to be broken... see the screenshot in my last post.

So what should I do?
Coordinator
Jun 25, 2016 at 3:09 PM
Ensure you're calling SetShouldDispose on the frame if you're waiting for the final parameter to be true, as the RtpClient will likely be in the middle of disposing it at the same time unless whoever is subscribed to the events indicates otherwise.

Here is some code I have been using for testing purposes; it shows how to depacketize the frame and pass it to a decoder, as well as how to show the timestamps and other useful information.
void DumpFrame(Media.Rtp.RtpFrame frame, ref bool final)
        {
            if (Media.Common.IDisposedExtensions.IsNullOrDisposed(frame)) return;

            int frameCount = frame.Count;

            if (frameCount.Equals(0)) return;

            System.Text.StringBuilder output = new System.Text.StringBuilder();

            output.Append("PT = ");

            output.Append(frame.PayloadType);

            output.Append(" ");

            if (final) output.Append("Final @ ");

            output.Append(frame.Timestamp + " / ");

            output.Append(frameCount);

            if (frameCount > 0)
            {
                output.Append(" - [" + frame.HighestSequenceNumber + " , " + frame.LowestSequenceNumber + "]");
            }
            else
            {
                output.Append(" - [" + frame.HighestSequenceNumber + "]");
            }

            output.Append(" MarkerCount = " + frame.MarkerCount);

            if (final)
            {
                bool complete = frame.IsComplete;

                if (false.Equals(complete)) ++missingPacketsOnFinal;

                output.Append(" Complete = " + frame.IsComplete);
            }
            else
            {
                output.Append(" IsMissingPackets = " + frame.IsMissingPackets);
            }

            System.Diagnostics.Trace.WriteLine(output);


            if(final && RtspClient.Client.GetContextByPayloadType(frame.PayloadType).MediaDescription.MediaType == Media.Sdp.MediaType.video)
            {
                DepacketizeFrame(frame);
            }

        }

        [System.Runtime.CompilerServices.MethodImpl(System.Runtime.CompilerServices.MethodImplOptions.AggressiveInlining)]
        void DepacketizeFrame(Media.Rtp.RtpFrame frame)
        {
            if (Media.Common.IDisposedExtensions.IsNullOrDisposed(frame) || decoder == null) return;            

            Media.Common.BaseDisposable.SetShouldDispose(frame, false, false);            

            using (RFC6184Frame instance = new RFC6184Frame(frame))
            {
                instance.Depacketize();

                if (instance.HasDepacketized)
                {
                    decoder.DecodeFrame(instance.Buffer.ToArray());
                }
            }

            Media.Common.BaseDisposable.SetShouldDispose(frame, true, false);

        }
You should be able to exchange the decoder.DecodeFrame with writing to a file or otherwise.


There are a few other patterns as well, e.g. you can optionally handle the frame events earlier and Dispose the frame yourself, or disable the frame events and just use the packet events; you can also use the RtpTools to make your life a little easier, or implement a different format, e.g. PCAP.

I will hopefully have time soon to add MP4 writing and finish up the reading support.

If you need anything further let me know!
Marked as answer by juliusfriedman on 6/25/2016 at 8:09 AM
Jun 26, 2016 at 9:28 AM
OK, thank you. But the video file is still broken. I have now changed my test to "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov".

But even with this, I am not able to create a video file without artifacts. (Tried with VLC Player and MPlayer.)

Can you provide source code where a file is generated without artifacts?

Here is my code:
class Program
    {
        static void Main(string[] args)
        {
            RtspClient client = new RtspClient("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov", RtspClient.ClientProtocolType.Tcp);
            client.OnConnect += Client_OnConnect;
            client.OnResponse += Client_OnResponse;
            client.OnPlay += Client_OnPlay;
            client.Connect();

            Console.ReadKey(); 
        }


        private static void Client_OnResponse(RtspClient sender, RtspMessage request, RtspMessage response)
        {
            //Console.WriteLine("request-------------------");
            //Console.WriteLine(request.ToString());
            //Console.WriteLine("response-------------------");
            //Console.WriteLine(response.ToString());
        }

        private static void Client_OnConnect(RtspClient sender, object args)
        {
            if (sender.IsConnected)
            {
                if (!sender.IsPlaying)
                {
                    sender.SocketReadTimeout = sender.SocketWriteTimeout = 30000;
                    sender.StartPlaying();
                }
            }
        }

        private static void Client_OnPlay(RtspClient sender, object args)
        {
            if (sender.IsPlaying)
            {
                sender.Client.FrameChangedEventsEnabled = true;
                sender.Client.RtpFrameChanged += Client_RtpFrameChanged;
            }
        }


        static bool firstFrame = true;

        [System.Runtime.CompilerServices.MethodImpl(System.Runtime.CompilerServices.MethodImplOptions.AggressiveInlining)]
        static void DepacketizeFrame(Media.Rtp.RtpFrame frame, Media.Rtp.RtpClient.TransportContext context)
        {
            if (Media.Common.IDisposedExtensions.IsNullOrDisposed(frame)) return;

            Media.Common.BaseDisposable.SetShouldDispose(frame, false, false);

            if (firstFrame)
            {
                firstFrame = false;
                AddInfo(context);
            }

            using (RFC6184Media.RFC6184Frame instance = new RFC6184Media.RFC6184Frame(frame))
            {
                instance.Depacketize();

                if (instance.HasDepacketized)
                {
                    //decoder.DecodeFrame(instance.Buffer.ToArray());
                    using (var fs = new System.IO.FileStream("Test.h264", System.IO.FileMode.Append))
                    {
                        instance.Buffer.CopyToStream(fs);
                    }
                }
                else
                {

                }
            }

            Media.Common.BaseDisposable.SetShouldDispose(frame, true, false);

        }

        private static void AddInfo(Media.Rtp.RtpClient.TransportContext context)
        {
            using (var fs = new System.IO.FileStream("Test.h264", System.IO.FileMode.Append))
            {

                Media.Sdp.Lines.FormatTypeLine fmtp = new Media.Sdp.Lines.FormatTypeLine(context.MediaDescription.FmtpLine);

                byte[] sps = null, pps = null;

                //If there was a fmtp line then iterate the parts contained.
                foreach (string p in fmtp.Parts)
                {
                    //Determine where in the string the desired token is.
                    string token = Media.Common.Extensions.String.StringExtensions.Substring(p, "sprop-parameter-sets=");

                    //If present extract it.
                    if (false == string.IsNullOrWhiteSpace(token))
                    {
                        //Get the strings which correspond to the data, split by ','
                        string[] data = token.Split(',');

                        //If there is any data then assign it

                        if (data.Length > 0) sps = System.Convert.FromBase64String(data[0]);

                        if (data.Length > 1) pps = System.Convert.FromBase64String(data[1]);

                        //Done
                        break;
                    }
                }

                //Prepend the SPS if it was found
                if (sps != null)
                {
                    //Write a leading zero so the start code is the 4-byte form (00 00 00 01)
                    fs.WriteByte(0);

                    //Write the 3-byte start code prefix
                    fs.Write(Media.Codecs.Video.H264.NalUnitType.StartCodePrefix, 0, 3);

                    //Write the SPS
                    fs.Write(sps, 0, sps.Length);
                }
                else throw new System.Exception("SequenceParameterSet not found");

                //Prepend the PPS if it was found.
                if (pps != null)
                {
                    //Write a leading zero so the start code is the 4-byte form (00 00 00 01)
                    fs.WriteByte(0);

                    //Write the 3-byte start code prefix
                    fs.Write(Media.Codecs.Video.H264.NalUnitType.StartCodePrefix, 0, 3);

                    //Write the PPS
                    fs.Write(pps, 0, pps.Length);
                }
                else throw new System.Exception("PictureParameterSet not found");
            }
        }

        static int missingPacketsOnFinal;

        static void DumpFrame(Media.Rtp.RtpFrame frame, ref bool final, Media.Rtp.RtpClient.TransportContext context)
        {
            if (Media.Common.IDisposedExtensions.IsNullOrDisposed(frame)) return;

            int frameCount = frame.Count;

            if (frameCount.Equals(0)) return;

            System.Text.StringBuilder output = new System.Text.StringBuilder();

            output.Append("PT = ");

            output.Append(frame.PayloadType);

            output.Append(" ");

            if (final) output.Append("Final @ ");

            output.Append(frame.Timestamp + " / ");

            output.Append(frameCount);

            if (frameCount > 0)
            {
                output.Append(" - [" + frame.HighestSequenceNumber + " , " + frame.LowestSequenceNumber + "]");
            }
            else
            {
                output.Append(" - [" + frame.HighestSequenceNumber + "]");
            }

            output.Append(" MarkerCount = " + frame.MarkerCount);

            if (final)
            {
                bool complete = frame.IsComplete;

                if (false.Equals(complete)) ++missingPacketsOnFinal;

                output.Append(" Complete = " + frame.IsComplete);
            }
            else
            {
                output.Append(" IsMissingPackets = " + frame.IsMissingPackets);
            }

            System.Diagnostics.Trace.WriteLine(output);



            if (final && context.MediaDescription.MediaType == Media.Sdp.MediaType.video)
            {
                DepacketizeFrame(frame, context);
            }
        }


        private static void Client_RtpFrameChanged(object sender, Media.Rtp.RtpFrame frame = null, Media.Rtp.RtpClient.TransportContext tc = null, bool final = false)
        {
            DumpFrame(frame, ref final, tc);
        }
    }
Coordinator
Jun 27, 2016 at 12:23 AM
The code I provide in the UnitTest project should suffice for your purposes, e.g. the code in TestRtspClient. At first glance it looks like you have sufficiently combined the logic in TestRFC6184VideoFrame to encompass the use of the RFC6184Frame class as required for depacketization, so I see no reason why you would have artifacts or otherwise any issues with viewing, unless you were dropping packets or there was some other issue with the application logic.

Since you say there are artifacts, and from that particular test URI (the Wowza demo), I would ask in turn if you can provide a Wireshark capture of the test run, along with the RtspClient.Logger output of the accompanying logic above, and finally the resulting H.264 file 'Test.h264' for my review.

With those 3 things I should be able to provide further meaningful feedback which may help you determine exactly what's going on.

I would ask you to test both Udp and Tcp and provide the results of both tests, so that I can further determine whether this is network or application related.

The one thing I do potentially see is that you are appending with the FileStream, closing it, and then very quickly opening it again; you can probably just keep the same FileStream instance open to reduce IO contention, which is one possible cause of the output not being written to the file, or not in the desired order.

You can also use a MemoryStream and wait for a certain amount of data to accumulate in it as the result of DepacketizeFrame, to reduce the possibility of the RtpClient missing packets while you are writing the depacketized output to the resulting FileStream.
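That buffering pattern, accumulate in memory and flush to disk in large chunks, can be sketched language-agnostically in Python (the class name and threshold are illustrative, not part of the library):

```python
import io

class BufferedFrameWriter:
    """Accumulate depacketized frames in memory and flush to disk in large
    chunks, so the receive loop is never blocked on file IO for long."""

    def __init__(self, path, flush_threshold=1 << 20):
        self._file = open(path, "ab")       # one handle, kept open
        self._buffer = io.BytesIO()
        self._flush_threshold = flush_threshold

    def write(self, data: bytes):
        self._buffer.write(data)
        if self._buffer.tell() >= self._flush_threshold:
            self.flush()

    def flush(self):
        # One large write instead of many small appends.
        self._file.write(self._buffer.getvalue())
        self._file.flush()
        self._buffer = io.BytesIO()

    def close(self):
        self.flush()
        self._file.close()
```

The same idea applies to the C# code earlier in the thread: keep a single FileStream open for the life of the session and stage depacketized buffers in a MemoryStream before writing.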

You could also use a Pool or another method of reusing already-allocated arrays, to ensure you have as little GC contention as possible when copying the depacketized Buffer, and so the resulting array can be reused after it has been written to the file.

You can turn on ThreadEvents in the RtpClient; this will result in slightly more memory and processor utilization, but it will allow you to handle the events in a manner which is potentially more desirable for your application and your environment.

If you do utilize ThreadEvents, please also keep track of that and differentiate the test results at least by name, so that it's easy to tell the difference when I or you are analyzing them.

Let me know if you can provide what I asked and I will take a further look and see if I can find anything wrong.

I will also try to provide a complete example for consuming the Wowza video, depacketizing it, and comparing the results against the actual MP4 from which it was derived. I will need to make some advancements in the BaseMediaReader; after I am done with that and can extract the samples reliably, I will ensure with a UnitTest that the logic works as expected, and that same test logic will be easily adaptable to other sources for the same purpose.

Sincerely,
Julius
Marked as answer by juliusfriedman on 6/26/2016 at 5:23 PM
Jun 27, 2016 at 9:21 AM
Edited Jun 27, 2016 at 9:55 AM
Thanks for all.

I changed my test code to avoid the IO problems. I also tried using ThreadEvents, but no change. I uploaded my source, Wireshark dumps, and logging here:
https://1drv.ms/f/s!AMIVwnu_zKYsfA

It would be nice to hear from you.
Coordinator
Jun 28, 2016 at 1:04 AM
I will review and let you know what I find as soon as possible.

Probably tomorrow or the day after.
Marked as answer by juliusfriedman on 6/27/2016 at 6:04 PM
Coordinator
Jun 28, 2016 at 3:22 AM
Edited Jun 28, 2016 at 3:23 AM
I took some time to start a preliminary investigation into what exactly is going on in your situation...

As confirmed by both dumps, it appears the H.264 stream has distortion when played back because of missing or corrupted data.

This is either because the Wowza server was maxed out when you made the capture, or because you were experiencing a network issue, or there is still the possibility that your application is doing something weird which causes this.

While I don't have a definite answer, I want to point out a few things to see if you can see where I am leaning...

Compare both of your captures... and the output...

The output is EXACTLY the same (apart from the few packets which are not in the other file, because one capture was ended slightly after the other).

This is good because it indicates your issue is "packet loss" somewhere...

But where?

If you look at both captures there is also something peculiar...

Right around packet 319 you seem to be experiencing an anomaly....

RTP PT = 97, Sequence Number 319 is completely missing in the ThreadEvents version, which is strange.

In the version without ThreadEvents you are missing audio data and video data right around the same sequence number, packets 317 - 318 this time.

This points to an APPLICATION-layer logic error which is causing this anomaly to manifest itself; what exactly, I am not yet sure, as you seem to have everything utilized correctly...

The weird thing is that your application should have output slightly different files for the resulting H.264, but instead what I see is the exact same file, even with the missing data...

You may be experiencing IO contention or something else in the application layer which I cannot replicate in my tests, which is why I cannot identify the bug: it's essentially in your code and not mine.

Since you also included the source, and I have access to an Android handset to test on, I will give your application some improvements where I see them possible and test it to see if I have the same issue; hopefully I am able to replicate it and possibly find a bug in the library.

I would also like to include an 'Android' project in the sources which has a Surface and a Button to launch the VideoView, so one can see and eventually hear the media besides just writing it to a file or watching debug statements. I already have such an example which looks surprisingly like yours, so I will likely merge the two and include them in the next release if possible.

I will also write back when I have more information.

Please do let me know if you find something else I might have missed or if you see something I didn't touch on.

Sincerely,
Julius
Marked as answer by juliusfriedman on 6/27/2016 at 8:22 PM
Jun 28, 2016 at 5:56 AM
Edited Jun 28, 2016 at 6:11 AM
First, thank you for your huge investment.

I double checked Wowza server with VLC Player Networkstreaming yesterday and 5 minutes ago. Always perfect streaming. But my video file always looks same.

I also tried debug/release with AnyCPU/X86 build with debugger attached and not attached, no changes.

My Testapplication can't be the problem. I just downloaded your sourcecode/ sln and added a new WPF Application Project and added the projektreferences to it. In codebehind I added the sourcecode I provided (.NET 4.6).

I have the same problems on two machines. Here at work I have a Win10 Home machine with an i7-4790K CPU, and at home I use a Win10 Pro machine with an i7-4790K (=> completely different network). At home I tested it with the newest version of your lib (home: 112165, work: 112159) and a console application instead of a WPF app (so completely different solutions). But always the same problems.

I would ask you to provide me an exe (console or WPF) which generates a video file from the Wowza server and produces a video file which works on your side. Then I can test this on several machines to ensure the problem is on my side.

But as you mentioned, my main goal is not to save the stream to a file on Windows machines (it should just be the first step...). I just want to play the stream live on Android and later iOS. So I have to use an H.264 decoder instead of file writing. But I thought it would be much the same.

If you have a running Xamarin Android project, could you provide it to me?

I uploaded my compiled exe; maybe you can try it on your side: https://1drv.ms/u/s!AMIVwnu_zKYsgQ0
Coordinator
Jun 28, 2016 at 1:19 PM
Maybe it has to do with your network or environment in some other capacity if not your application, but I still disagree that your application has absolutely no way to be improved...

Yes, I probably can provide a test application for Android, but my current source code for the test application doesn't write an H.264 file yet, so I am not sure how useful you will find it...

If you would like, I can go over what I find in your code which has the potential to cause IO issues, or I can add the file-writing option to my test app, compare the results and share them with you?

There is little to no difference in the code which is responsible for writing the data to the file, so Android versus WPF should not matter...

Debug versus Release may matter depending on how fast the processor is and how much RAM is available.

In your application LOG it seems that you definitely have 'DUPLICATE' packets, and you're definitely receiving them in the application OUT OF ORDER:
  • PT = 97 Final @ 1091250 / 1 - [317 , 317] MarkerCount = 1 Complete = True
  • PT = 97 1098720 / 1 - [319 , 319] MarkerCount = 1 IsMissingPackets = False
  • PT = 97 Final @ 1095030 / 1 - [318 , 318] MarkerCount = 1 Complete = True
  • PT = 97 1102500 / 1 - [320 , 320] MarkerCount = 1 IsMissingPackets = False
  • PT = 97 Final @ 1098720 / 1 - [319 , 319] MarkerCount = 1 Complete = True
  • PT = 97 1106280 / 1 - [321 , 321] MarkerCount = 1 IsMissingPackets = False
  • PT = 97 Final @ 1102500 / 1 - [320 , 320] MarkerCount = 1 Complete = True
This is why you have distortion (combined with loss). There is no other reason; I can assure you of this.
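To illustrate what the log above shows, both the duplicate and out-of-order cases can be detected from the 16-bit RTP sequence numbers alone. This C# sketch is not part of the library; the class and method names are invented for the example.

```csharp
using System;

// Illustrative only: classifies incoming RTP sequence numbers as in-order,
// duplicate or reordered. RTP sequence numbers are 16-bit and wrap at 65536,
// so the delta is computed as a signed 16-bit value (RFC 3550 style).
public class SequenceTracker
{
    int last = -1; // -1 until the first packet has been observed

    public string Observe(ushort seq)
    {
        if (last < 0) { last = seq; return "ok"; }
        short delta = (short)(seq - (ushort)last); // wrap-safe comparison
        if (delta == 0) return "duplicate"; // same packet delivered twice
        if (delta < 0) return "reordered";  // arrived after a later packet
        last = seq;
        return "ok";
    }
}
```

Fed the sequence numbers from the log (317, 319, 318, 320), this reports 318 as reordered, which matches the interleaving shown above.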

In your 'Source.cs', what causes this to occur? I would suspect that you're writing packets out of order because an allocation of the MemoryStream (or some other allocation) takes longer than you expect.

How does the memory stream queue get populated?

You use 'DepacketizeFrame' to also allocate the MemoryStream, which is slightly incorrect as that occurs on the library's receive thread; you need to be as quick as possible when handling the events to ensure your application does not cause loss.

You should allocate the MemoryStream instances up front, before you even need them: e.g. make two memory streams of the size you need, and when one is full or close enough, swap it out with the other stream and begin writing.

Another possible issue is that AddInfo opens and closes the file stream; this causes contention because you perform another open and close in the same call, as well as the allocation for the MemoryStream.
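The two suggestions above (pre-allocated, swappable MemoryStreams, and a file stream that is opened only once) can be sketched roughly as follows. This is a minimal single-threaded illustration, not the library's code; all names are invented, and a real version would flush the standby buffer on a separate thread.

```csharp
using System;
using System.IO;

// Sketch: two pre-allocated MemoryStreams are swapped so the event handler
// never allocates, and a single FileStream stays open for the whole
// recording instead of being opened and closed on every write.
public class DoubleBufferedWriter : IDisposable
{
    const int BufferSize = 64 * 1024;
    readonly FileStream output;                          // opened once
    MemoryStream active = new MemoryStream(BufferSize);  // being filled
    MemoryStream standby = new MemoryStream(BufferSize); // ready to swap in

    public DoubleBufferedWriter(string path)
    {
        output = new FileStream(path, FileMode.Create, FileAccess.Write);
    }

    // Called from the frame event; does no allocation and returns quickly.
    public void Append(byte[] data, int offset, int count)
    {
        active.Write(data, offset, count);
        if (active.Length >= BufferSize) Flush();
    }

    void Flush()
    {
        active.WriteTo(output); // drain the full buffer to disk
        active.SetLength(0);
        var tmp = active; active = standby; standby = tmp; // swap
    }

    public void Dispose()
    {
        Flush();          // write out whatever remains
        output.Dispose(); // close the file exactly once
    }
}
```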

Do you not agree with these things?

Furthermore, since you're just writing to a file, you probably want to take some effort to ensure that what you're writing is in the order you expect. Sometimes a decoder can handle out-of-order access units, sometimes it can't, so this is an important distinction to make. This is also important if other threads may be writing to the file at the same time.

Possibly, if you add another thread to your application, you can make that thread responsible for the ordering and depacketization rather than using just your main thread to handle the UI; the worker thread would then feed the data from the library to the components of the application where it is further required, e.g. for writing to a file or display.
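One way to structure that worker thread, sketched here with invented names (this is not the library's API): the event handler only enqueues, and a dedicated consumer reorders frames within a small window before handing them on.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Sketch: the receive thread calls Enqueue and returns immediately; Drain
// runs on a worker thread, buffers a small window of frames in a
// SortedDictionary and emits them in sequence order (duplicates overwrite).
public class FrameQueue
{
    readonly BlockingCollection<KeyValuePair<int, byte[]>> queue =
        new BlockingCollection<KeyValuePair<int, byte[]>>();

    public void Enqueue(int sequence, byte[] payload)
    {
        queue.Add(new KeyValuePair<int, byte[]>(sequence, payload));
    }

    public void CompleteAdding() { queue.CompleteAdding(); }

    // Blocks until CompleteAdding is called; 'sink' sees frames in order.
    public void Drain(Action<int, byte[]> sink, int window = 4)
    {
        var pending = new SortedDictionary<int, byte[]>();
        foreach (var item in queue.GetConsumingEnumerable())
        {
            pending[item.Key] = item.Value;
            if (pending.Count > window) EmitLowest(pending, sink);
        }
        while (pending.Count > 0) EmitLowest(pending, sink); // drain the rest
    }

    static void EmitLowest(SortedDictionary<int, byte[]> pending,
                           Action<int, byte[]> sink)
    {
        foreach (var kv in pending) // first entry has the lowest key
        {
            sink(kv.Key, kv.Value);
            pending.Remove(kv.Key);
            return;
        }
    }
}
```

The window trades latency for reordering tolerance: a larger window absorbs more reordering but delays the output further.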

What points me in this direction is that in BOTH CASES there is some anomaly right at the same point in time, e.g. packets 315 - 320; thus there clearly are issues with re-transmissions of packet 320, either due to the network or application logic, and NOT the library.

-Julius
Marked as answer by juliusfriedman on 6/28/2016 at 6:19 AM
Jun 28, 2016 at 1:44 PM
I will investigate your suggestions. Meanwhile, can you send me your Android streaming example? Did you already use a MediaCodec decoder in it? Anything could help me.

Thanks,
Mark
Coordinator
Jun 28, 2016 at 1:48 PM
Edited Jun 28, 2016 at 1:50 PM
Yes I did and it seems to work well on the devices I have tested it on.

Yes I probably can provide you the test application.

I am currently waiting on some information from Xamarin in relation to my Visual Studio setup / license agreement; once they have responded I can determine exactly how to proceed in that respect.

If nothing else I will be able to email you a zip file with the code, just send me an email and I will respond with it.

I will also take a look into writing the file from Android and get back to you.
Marked as answer by juliusfriedman on 6/28/2016 at 6:49 AM
Coordinator
Jun 28, 2016 at 2:11 PM
I responded, please let me know if there is anything else I can do for you.

I will see about making the TestApp public with the file writing ability after I have a response from Xamarin.
Marked as answer by juliusfriedman on 6/28/2016 at 7:11 AM
Jun 28, 2016 at 5:55 PM
Edited Jun 28, 2016 at 5:57 PM
I'm testing now with Android and direct decoding...

I downloaded the newest sln from your project and just added the Android project. It builds fine. But now, after starting the rtsp connection, I get many "System.Net.Sockets.SocketException: Protocol option not supported" exceptions (in SocketExtensions.SetUnicastPortReuse) and one "System.Net.Sockets.SocketException: Connection refused" in RtpClient.ReceiveData.

I changed nothing, so my first question: is this actually running on your side? Or do I have to modify something?
Coordinator
Jun 28, 2016 at 6:08 PM
Edited Jun 28, 2016 at 6:09 PM
Yes, I am quite sure that in 112162 the 'TestApp' worked fine on at least 3 different phones: 2 Samsung and a ZTE.

I wouldn't tell you I have verified it and sent it to you if I didn't know personally that it was working first hand.

I have only made some small changes since then, and they shouldn't affect anything; although I also haven't built using the new assemblies on Android, so I honestly can't say for certain, but I am fairly sure...

One thing I am aware of is that under Android it appears TCP sockets cannot have re-use-address applied at all; they will not be able to send or receive if that option is set, whereas normally the option would simply fail to be set on the socket.

If this is what is happening, then you will need to pass a different ConfigureRtp / Rtsp socket delegate, or you will need to step through, see where the exception happens, and not apply the option which causes it.

I am pretty sure this revision in 'ConfigureRtspSocket' will allow it to work:
//Xamarin's .NET implementation on Android suffers from nuances that neither Core nor Mono suffer from.
//If this option fails to be set there, the socket can't be used easily.
if (false.Equals(Media.Common.Extensions.RuntimeExtensions.IsAndroid))
{
    //Ensure the address can be re-used
    Media.Common.Extensions.Exception.ExceptionExtensions.ResumeOnError(() => Media.Common.Extensions.Socket.SocketExtensions.EnableAddressReuse(socket));

    //Windows >= 10 and some Unix
    Media.Common.Extensions.Exception.ExceptionExtensions.ResumeOnError(() => Media.Common.Extensions.Socket.SocketExtensions.EnableUnicastPortReuse(socket));
}
Let me know and I will update the code; otherwise you will have to wait until I can get a few minutes to rebuild the test app and let you know what the issue is, or whether I can even replicate it.
Marked as answer by juliusfriedman on 6/28/2016 at 11:08 AM
Jun 28, 2016 at 8:12 PM
Your IsAndroid check is not working; I commented it out, along with 3-4 more ResumeOnError(() => ...) calls.

Now I get an exception in H264Decoder.ProcessDecoderOutput at codec.Configure(formatNew, _surface, null, MediaCodecConfigFlags.None): Java.Lang.IllegalStateException?

I removed this line too. Now the BigBuckBunny video is streamed, but with a huge amount of artifacts; 80% of the picture is covered with them... So the video is unwatchable.

As I mentioned, no code changes, just your code with 112165. Maybe you could test the test app on your side?
Coordinator
Jun 28, 2016 at 9:34 PM
1)

I just ran the latest release version on Android and I had no issues connecting, and no exceptions, even with the IsAndroid block being entered. As I stated before, I didn't think it made a difference, but I was aware that certain options cannot be set on sockets in Android; when they fail, the socket is unusable until the option is removed.

I advised about the possible solution and I even updated the code to ensure that I didn't have any issues.

2) You can pass your own ConfigureRtspSocket or ConfigureRtpSocket delegate; I provide those for convenience only. If they don't work for you, then you will need to configure your socket appropriately, as you require.

Possibly setting a larger receive buffer, or whatever other options you might require, would also help in your scenario.
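As a rough illustration of what such a custom configuration routine could do: the sketch below uses only plain .NET socket calls, since how the delegate is attached to the RtspClient depends on the library version; the values chosen are illustrative, not prescriptive.

```csharp
using System.Net.Sockets;

// Sketch of a custom socket-configuration routine (not the library's code):
// enlarge the OS receive buffer and tolerate failure of the reuse option
// that misbehaves on some Android builds.
public static class SocketSetup
{
    public static void ConfigureSocket(Socket socket)
    {
        // A larger OS receive buffer reduces the chance of UDP loss when
        // the application momentarily falls behind; tune to your bitrate.
        socket.ReceiveBufferSize = 1024 * 1024;

        // Match the 30-second timeouts used elsewhere in this thread.
        socket.ReceiveTimeout = 30000;

        try
        {
            // Address reuse is known to misbehave on some Android builds,
            // so a failure here is tolerated rather than treated as fatal.
            socket.SetSocketOption(SocketOptionLevel.Socket,
                                   SocketOptionName.ReuseAddress, true);
        }
        catch (SocketException)
        {
            // The socket is still usable without address reuse.
        }
    }
}
```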

3) If you're getting an illegal state exception, it's because the SPS and PPS are not valid for some reason.

4) You must configure the Codec instance with the Surface or you can't decode to the Surface. If you remove that Configure call and still get video, then where is the Surface assigned and where is the output rendered to? I don't think that DequeueOutputBuffer will even give you any output buffers unless there is a valid Surface for it to work with.

You must be using OutputFormatChanged to assign the surface, then?

5) I see some artifacts in the video also; it gets better or worse depending on various things, but I don't think that is the point...

You asked for an example application, not a media player...

The decoder is probably not complete and the view is definitely not complete; I used DropVid combined with the existing code which was already being used, simply to verify that the RtspClient and RtpClient work and that they can in turn work with MediaCodec.

This is proven and verified; I didn't spend much time on the UI or on improving the buffering mechanism at all.

The decoder should probably Flush at certain points to ensure that there are fewer artifacts, if possible, and should also attempt to enforce the framerate when decoding...

Finally, the artifacts are called error concealment and usually only appear when the stream is corrupt, packets are missing, or the decoder input was queued and de-queued for decoding before the entire frame data was available, causing other buffers to be discarded.

At this point I have provided you with both improvements you could make to the WPF app and a TestApp for Android which also shows the Rtp and Rtsp clients working as expected.

In relation to writing the output H.264 file: you stated previously that you had another version working, but after updating to the latest version you have issues. What version worked for you? Does no other version since then work? Is the code which was working the same code as was provided in 'TestCode.cs'?

The later versions of this library are ideally more stable than the previous versions and offer better performance, so that further indicates there is something wrong at the network or application level.

The other thing which pretty much solidifies that opinion for me is that you get the same application-layer output with threads or without... this can't be possible unless the application itself is throttling the data to the file in some way, either due to GC or otherwise.

Furthermore, it proves that the application is ordering the packets consistently in both the threaded and non-threaded cases; otherwise the output would be different.

I hope you understand; please let me know how I can help you further!

Sincerely,
Julius
Marked as answer by juliusfriedman on 6/28/2016 at 2:34 PM
Jun 29, 2016 at 6:40 AM
Thanks, much to read. I must spend some time on all of it.

But how can System.Type.GetType("Android.OS") ever return anything? "Android.OS" is just a namespace. In the emulator and on my device it is always null. This is absolutely new to me ;-)
Coordinator
Jun 29, 2016 at 2:40 PM
Sorry for the long-winded response; I just wanted it to be as complete as possible.

Xamarin is changing rapidly, so we are all 'new to this'... Also, there is really nothing to be new to unless you have to interact with the UI or model-specific features, which is where most of the Java-style API comes in and why Xamarin did things the way they did...

Now that Microsoft owns Xamarin a lot will be changing...

Right now, when you run your application through Visual Studio, you are executing Mono and not the MS CLR on Android, and you are doing so through a VM which is on Linux (even on your handset). They supposedly use an advanced JIT, but in what way it's more advanced than Mono's or the CLR's remains to be seen...

iOS is different, and arguably the better of the two, because it provides a way to compile the IL directly to machine code which is executed on the handset.

I hadn't taken much time to support Xamarin before because it wasn't free; it also provided no immediate benefit beyond reducing development time at the cost of performance, since either a C or Java application would obviously run better than an emulated x86 application, even with an advanced JIT...

To answer your question...

It would have to be defined somewhere...

Go ahead and try to verify any of the types required outside of the assembly which is run in the Activity; they are in a different AppDomain and thus cannot be accessed easily.

Verify with System.Type.GetType("Android.OS.Signal"), which will return something in the AppDomain where your Activity is, but not outside of it.

I have verified IsMono works on the handset, so obviously I am doing something right... What you and others need to understand is that the library needs to work on Mono OUTSIDE of Xamarin just as well as it does in Xamarin, for those who need it.

It's not that hard to define a Type which corresponds to 'Android.OS' and combine that with #if or a partial class when required; although if you do that in the Activity, you won't be able to access Android.OS unless you use an alias or are careful how you declare the type.
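The namespace-versus-type point can be demonstrated directly: Type.GetType resolves type names (assembly-qualified for anything outside the calling assembly and mscorlib), never namespaces, which is why probing "Android.OS" alone always yields null. The Mono.Android type name used below is an assumption for illustration.

```csharp
using System;

public static class PlatformProbe
{
    // "Android.OS" is a namespace, so this lookup can never succeed;
    // Type.GetType only resolves type names, and returns null by default
    // rather than throwing when the name cannot be found.
    public static bool NamespaceProbeWorks()
    {
        return Type.GetType("Android.OS") != null; // always false
    }

    // Probing for a concrete, assembly-qualified type is the reliable way;
    // the exact type/assembly name here is illustrative only.
    public static bool LooksLikeAndroid()
    {
        return Type.GetType("Android.OS.Build, Mono.Android") != null;
    }
}
```

On a desktop runtime both probes return false, while a type in the current assembly or mscorlib, e.g. Type.GetType("System.String"), resolves without qualification.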

I hope this answers your question...
Marked as answer by juliusfriedman on 6/29/2016 at 7:40 AM
Coordinator
Jul 5, 2016 at 10:48 PM
I have personally verified there are no artifacts in the latest release when used in conjunction with the 'TestApp' on Android.

The problem was in the RFC6184Frame class...

Let me know if you're still having issues.
Marked as answer by juliusfriedman on 7/5/2016 at 3:48 PM
Jul 6, 2016 at 3:54 PM
Edited Jul 7, 2016 at 5:52 AM
Yes, I can confirm that with the new version it looks fine. My file-save test runs perfectly now (with the bunny video and my cameras!). With the Android app ("Test App") and the decoder I still have some issues, but I have to spend more time on it. It is running much better too, though. Did you change something in the "Test App"?

Thanks for the last update. Nice work!

What about the new nCodec project? I didn't see a sample. Is this working with H.264 decoding already, and do you have a sample? At first I thought it was cscodec, but it isn't.
Coordinator
Jul 6, 2016 at 5:19 PM
Edited Jul 6, 2016 at 5:20 PM
Nothing in TestApp really should need to be changed... (ideally), besides the gray spots in the API (attaching connect when already attached, etc.).

Keep in mind that the app was quickly thrown together to support and address issues found with the library and is not really polished.

In the H264 decoder class I changed a few things, and I get good playback under TCP and UDP; even in DEBUG with all the logging it seems to work well. I need to test more phones and environments to be 100% sure, and I hope other people provide feedback which will also assist in verifying this.

I would also like to add audio support and playback controls to the application, but I have the library to worry about first... I really need to revise a few things in the RtspClient and RtpClient and complete Http tunneling support...

'nCodec' was a port of jCodec I did about 2 or 3 years ago now. It was working at the time to the level that the same jCodec version was, and was actually made slightly faster by changing the layout of the Picture class to use a single-dimensional array rather than a multidimensional array. The jCodec code is heavily based on ffmpeg / libav, which is where some of the similarities with csCodec come in.

I have since moved to my own implementation of Image and Picture and their constituents, which offers more meta information about the Format, cleaner code for conversions, and the ability to define formats which are aligned to offsets of unused space in other formats; eventually it will utilize SIMD or other intrinsics where required to provide hardware acceleration through the CPU or an external device.

nCodec would thus be a built library which references the libraries required to provide the best performance and support on top of Media:

1) nCodec
- Common, Codec, etc
2) Application
-nCodec

The design currently allows you to swap out nCodec to target other versions of Common, Codec or everything nCodec contains by replacing only nCodec.

This indirection is useful for various things and also allows two different versions of a single library to be used from the Application if required for some reason.

As for a sample... I don't have much, because I didn't do much with it other than port it...

Check http://jcodec.org/ which may help you but remember I have changed the API slightly in some instances.

There were also some issues after the change to single-dimensional arrays which I never really fixed; you can either revert the types required in those cases or fix the bugs where the old code created multi-dimensional arrays and the new code creates single-dimensional arrays.

The more feedback I get the more work I will put into the library.

What I am concerned with at this point is the API; any constructive criticism you can provide about what gives you a hard time or what you don't understand would help me revise it and provide something better.
Marked as answer by juliusfriedman on 7/6/2016 at 10:19 AM
Jul 7, 2016 at 6:26 AM
Edited Jul 7, 2016 at 6:45 AM
Yes, of course I can and will provide better feedback. First I have to do other things at work, but I'm planning to use your system in our applications.

I think one thing you should really consider is providing a better sample system (especially for people like me, an absolute noob at rtsp/video). Of course you can find much information in the discussions (and your unit test project), but just for that I had to spend much time reading through all the posts.

In the last week I got 4 e-mail questions about your library from different people who want to use it but struggled with the lack of samples.

Another big thing is decoding, I think. I understand that it is not your main goal, and because of the hundreds of different codecs it would be a huge fight, but just some working samples in your library would bring it to the next level.

For decoding: for me it is hard to estimate which way is best to go. Often mentioned in the discussions is cscodec; now you have nCodec. On Xamarin.Android there is MediaCodec, then the ffmpeg binding and the VLC binding...