The method or operation is not implemented. ... System.Net.NetworkInformation.UnixIPGlobalProperties.GetActiveUdpListeners

Topics: Bug Archive
Feb 3, 2016 at 1:30 PM
Edited Feb 3, 2016 at 1:30 PM
Hello,

Thanks a lot for Managed Media Aggregation; it looks extremely useful, and I hope to get it working soon. It would be a great help for my app.

I am not sure if it is a bug. I have a SIP connection and start the RTP client from this solution. If I have no media info, it connects okay.

At first I had incomplete media info, and it told me:
Invalid Media Description

I understand that. But now everything looks okay after I added media info, yet when I call Client.Activate() on my RtpClient

I get an exception:

The method or operation is not implemented. ... System.Net.NetworkInformation.UnixIPGlobalProperties.GetActiveUdpListeners

The complete error message is attached at the end.
When searching for GetActiveUdpListeners, I found something in the "Common" project. I have already added it as a reference to my project.

My project is written in C# with Xamarin 4.0. I am testing it on an Android device attached via USB to my PC (Visual Studio 2015).

Is it a bug, is something missing in your solution, or do you have a hint for me on how to solve this problem?

Thanks a lot!

Eric


{System.NotImplementedException: The method or operation is not implemented.
at System.Net.NetworkInformation.UnixIPGlobalProperties.GetActiveUdpListeners () [0x00000] in /Users/builder/data/lanes/2512/d3008455/source/mono/mcs/class/System/System.Net.NetworkInformation/IPGlobalProperties.cs:149
at Media.Common.Extensions.Socket.SocketExtensions.FindOpenPort (ProtocolType type, Int32 start, Boolean even) [0x00029] in C:\Projekte\net7mma-111867\Common\Extensions\SocketExtensions.cs:76
at Media.Rtp.RtpClient.FromSessionDescription (Media.Sdp.SessionDescription sessionDescription, Media.Common.MemorySegment sharedMemory, Boolean incomingEvents, Boolean rtcpEnabled, System.Net.Sockets.Socket existingSocket, Nullable`1 rtpPort, Nullable`1 rtcpPort, Int32 remoteSsrc, Int32 minimumSequentialRtpPackets) [0x000d4] in C:\Projekte\net7mma-111867\Rtp\RtpClient.cs:276
at SKSSipClient.ViewModels.IncomingViewModel.InitSession (Independentsoft.Sip.Sdp.SessionDescription session) [0x00099] in ...}
Feb 3, 2016 at 2:03 PM
I must add:

it is not in Client.Activate(). It is in the step before that:

RtpClient rtpClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription);

Here I already get the exception mentioned above.

Thanks!
Coordinator
Feb 4, 2016 at 1:31 PM
Edited Feb 4, 2016 at 10:59 PM
Thanks for your interest and kind words.

Can you post a copy of the session description so I can take a look?

I think the issue is with Mono, so please make sure the method you're trying to use is available in the Mono framework you have installed.
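
You can check that directly; a minimal sketch using only the standard System.Net.NetworkInformation API:

    try
    {
        // Throws NotImplementedException on Mono profiles that lack this method.
        var listeners = System.Net.NetworkInformation.IPGlobalProperties
            .GetIPGlobalProperties()
            .GetActiveUdpListeners();
    }
    catch (System.NotImplementedException)
    {
        // Not available on this platform; another port-discovery strategy is needed.
    }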

If you need anything else let me know!
Marked as answer by juliusfriedman on 2/4/2016 at 5:31 AM
Feb 5, 2016 at 8:47 AM
Hello Julius,

Thanks a lot for your reply.

I have a Google Nexus 7 with Android 5.0 connected to my PC via USB, and it can connect to my SIP/RTP gateway via WLAN.

On my PC I use Visual Studio 2015 to start and debug my client. This is a Xamarin 4.0 solution, tested using the Android target on my Nexus.
The SIP part (using the Independentsoft.SIP API) works; then I start my RTP connection:

{v=0
o=sdm 2890844526 2890844526 IN IP4 127.0.0.1
s=SIP Call
c=IN IP4 192.168.1.114
t=0 0
m=video 9078 RTP/AVP 96 97 98
a=rtpmap:96 H264/90000
a=rtpmap:97 H264/90000
a=rtpmap:98 H264/90000
}

Since the session description type is different, I map it like this (session is what I get from the SIP client after I have connected, registered, and then received an incoming audio/video call):
            Media.Sdp.SessionDescription sessionDescription = new Media.Sdp.SessionDescription(session.ToString());
            sessionDescription.SessionName = session.Name;
            sessionDescription.SessionId = session.Owner.SessionID.ToString();
session.ToString() yields the SDP shown above for starting the RTP connection.


After that:
            RtpClient rtpClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription);

            rtpClient.RtpPacketReceieved += (sender, rtpPacket, tc) => { ClientLogger.Instance.Log("Got a RTP packet, SequenceNo = " + rtpPacket.SequenceNumber + " PayloadType = " + rtpPacket.PayloadType + " Length = " + rtpPacket.Length); };
            rtpClient.RtpFrameChanged += (sender, sender2, rtpFrame, tc) => { ClientLogger.Instance.Log("Got a RTP Frame PacketCount = " + rtpFrame.CurrentFrame.Count + " Complete = " + rtpFrame.CurrentFrame.IsComplete); };
            rtpClient.RtcpPacketReceieved += (sender, rtcpPacket, tc) => { ClientLogger.Instance.Log("Got a RTCP packet " + " Type=" + rtcpPacket.PayloadType + " Length=" + rtcpPacket.Length + " Bytes = " + BitConverter.ToString(rtcpPacket.Payload.Array)); };

            rtpClient.Activate();
I get the NotImplementedException here: RtpClient rtpClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription);

Any idea how I can solve that problem? Or maybe there is a workaround?

Thanks a lot, help would be great :-)

Eric
Coordinator
Feb 5, 2016 at 7:47 PM
Check out the implementation from MOSA @ https://github.com/mosa/Mono-Class-Libraries if the function isn't implemented in your version of Mono.

Other than that, you would need to use another utility to get the information and then determine whether the port is in use.
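
For example, a minimal sketch of that idea (not the library's code), treating a successful Bind as "port free":

    using System.Net;
    using System.Net.Sockets;

    static int FindOpenUdpPort(int start = 30000)
    {
        for (int port = start; port <= ushort.MaxValue; port++)
        {
            try
            {
                using (var probe = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp))
                {
                    probe.Bind(new IPEndPoint(IPAddress.Any, port)); // throws if the port is taken
                    return port;
                }
            }
            catch (SocketException) { /* in use, try the next one */ }
        }
        return -1; // nothing free in the range
    }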

Let me know if you need anything else!

-Julius
Marked as answer by juliusfriedman on 2/5/2016 at 11:47 AM
Feb 8, 2016 at 9:55 AM
Edited Feb 8, 2016 at 9:57 AM
Hello Julius,

I also created a Xamarin Thread here:
https://forums.xamarin.com/discussion/comment/179336?
They recommended a socket library, but I don't know how it would help, because I am not deep enough into this topic.

I made a "hack" in your library:

In your SocketExtensions.cs I wrapped the body of FindOpenPort in a try/catch:

    try { /* all existing code here */ }
    catch
    {
        //TODO: test
        return 30000;
    }
At least I get no exception here anymore. Then I had to change the following as well, because I also got a NotImplementedException in your IPAdressExtensions.cs (MapToIPv6 is not found):
    //ToDo test
    //private static System.Net.IPAddress emptyIpv6 = emptyIpv4.MapToIPv6();
    //private static System.Net.IPAddress intranetMask1v6 = intranetMask1v4.MapToIPv6();
    //private static System.Net.IPAddress intranetMask2v6 = intranetMask2v4.MapToIPv6();
    //private static System.Net.IPAddress intranetMask3v6 = intranetMask3v4.MapToIPv6();
    //private static System.Net.IPAddress intranetMask4v6 = intranetMask4v4.MapToIPv6();
    private static System.Net.IPAddress emptyIpv6 = System.Net.IPAddress.Parse("0.0.0.0");
    private static System.Net.IPAddress intranetMask1v6 = System.Net.IPAddress.Parse("10.255.255.255");
    private static System.Net.IPAddress intranetMask2v6 = System.Net.IPAddress.Parse("172.16.0.0");
    private static System.Net.IPAddress intranetMask3v6 = System.Net.IPAddress.Parse("172.31.255.255");
    private static System.Net.IPAddress intranetMask4v6 = System.Net.IPAddress.Parse("192.168.255.255");
At least I get it started now on my Android device. Of course it would be great if you could find a better solution than my hack, if you want to improve your great library for working on Xamarin :-)

Thanks a lot,

Eric
Coordinator
Feb 8, 2016 at 8:01 PM
I think what they are saying is that the method is implemented but not for your platform.

They recommend that library because it offers a way for you to use the method which works for your platform.

I will take a look and see if there's any other way to call the method on Android when I get a chance later in the week.
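
For reference, the mapping itself is simple enough to do by hand; a sketch of what MapToIPv6 produces (an IPv4-mapped IPv6 address, ::ffff:a.b.c.d):

    static System.Net.IPAddress MapToIPv6Compat(System.Net.IPAddress v4)
    {
        byte[] v6 = new byte[16];                               // 80 zero bits...
        v6[10] = 0xFF; v6[11] = 0xFF;                           // ...16 one bits (the ::ffff: prefix)...
        System.Array.Copy(v4.GetAddressBytes(), 0, v6, 12, 4);  // ...then the four IPv4 octets
        return new System.Net.IPAddress(v6);
    }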
Feb 9, 2016 at 9:02 AM
Perhaps this might help:

https://www.nuget.org/packages/rda.SocketsForPCL/

It seems to have similar functions on Xamarin for both UDP and TCP listeners. I think this is also the cause of my later port problems.

Thanks a lot Julius!
Marked as answer by juliusfriedman on 2/11/2016 at 12:08 PM
Coordinator
Mar 11, 2016 at 5:37 PM
I also added ProbeForOpenPort to support this without an implementation of those methods.

Thanks again for bringing this up.
Marked as answer by juliusfriedman on 3/11/2016 at 9:37 AM
Mar 14, 2016 at 1:24 PM
Hello Julius,

I was ill for a while, but now I am back again.
I got a new version of the server/gateway and downloaded the latest version of your package today; thanks a lot for it!

I will add some feedback later.
Marked as answer by juliusfriedman on 3/14/2016 at 6:51 AM
Mar 14, 2016 at 4:23 PM
Hello Julius,

We are starting an incoming call (I am the client; the gateway places the call).

We receive the INVITE (from the SIP gateway) with an incoming media session description.

I want to answer with a Ringing and an OK response.

Then I get an ACK and start my RtpClient.

It seems to work, but I still receive no RTP data from the gateway.

Can you tell me when to use the incoming media session description and when to use my OWN media description?

Especially when I create my RtpClient with RTPClient.FromMediaDescription?


Thanks a lot,
Firlefanz
Coordinator
Mar 14, 2016 at 4:40 PM
You should always use the SDP from the gateway, unless you are announcing media to the gateway, in which case you would create your own.

In your example, once you have received the SDP, the FromSessionDescription method should work fine.

You can verify whether the gateway sends you packets by looking in Wireshark for any RTP packets which are sent out from the gateway and successfully arrive at your computer.

If you see packets arriving, their source and destination ports should match those given by the INVITE's SDP.

I would only be surprised if you saw packets arrive in Wireshark and didn't see them in the RtpClient, because that would indicate a problem in the library.

In the past, from what I remember of your captures, you never received data from the correct ports; the data seemed to be multicast when you were expecting unicast, indicating some type of routing issue.

Let me know if that answers your question and if you need further help.
Marked as answer by juliusfriedman on 3/14/2016 at 8:40 AM
Mar 15, 2016 at 10:01 AM
Hello Julius,

Something was wrong with the session description, and it is fixed now.

I can see the server/gateway now sending both video and audio packets for the first time. :-)

But I cannot receive them...

I wonder why, in your RtpClient.cs, both handlers for receiving data are assigned only once and are commented out:
        //RtpPacketReceieved += new RtpPacketHandler(HandleIncomingRtpPacket);
        //RtcpPacketReceieved += new RtcpPacketHandler(HandleIncomingRtcpPacket);
How do I receive RTP packets? I am using:

RtpClient rtpClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription);

rtpClient.RtpPacketSent += OnSourceRtpPacketSent;
rtpClient.RtcpPacketSent += OnSourceRtcpPacketSent;
rtpClient.RtpPacketReceieved += OnSourceRtpPacketRecieved;
rtpClient.RtpFrameChanged += OnSourceFrameChanged;
rtpClient.RtcpPacketReceieved += OnSourceRtcpPacketRecieved;

rtpClient.Activate();
I am currently investigating. Thanks a lot again for your help!
Coordinator
Mar 15, 2016 at 12:25 PM
Edited Mar 15, 2016 at 12:26 PM
The client handles all incoming packets and raises events for them; it doesn't subscribe to its own events because it completely handles its own logic without them. This can be disabled with the properties HandleIncomingRtpPackets etc.
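
For example (a sketch; the property name is as given above, so verify it against your version):

    rtpClient.HandleIncomingRtpPackets = false; // per the description above: events still fire, the internal processing is skipped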

In fact, sending should follow the same style; if it doesn't, I will take a look to ensure it matches.

If it were the other way around, the client would handle packets at the same time as event subscribers, with no way to cancel the event without adding an additional state value to the event signature, and no way to handle those same packets if the events were disabled.

I'm fairly sure your problem is going to be with the receive ports, if anything; can you send me a small Wireshark capture so I can verify?

If you see RTP packets coming in, they should have source and destination ports which match those in the INVITE and SDP, and the address should be the same as indicated in the SDP of the INVITE as well.

If it's not, then depending on how the gateway sends the packets to you, you will have to either join the multicast group beforehand or set the RemotePort to 0 to allow any port.
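
Something like this for the second option (a sketch; the names are inferred from this thread, so verify them against your version):

    // Allow packets from any source port by zeroing the remote RTP endpoint
    // on each transport context before activating.
    foreach (var tc in rtpClient.GetTransportContexts())
    {
        ((System.Net.IPEndPoint)tc.RemoteRtp).Port = 0; // 0 = discover on the first receive
    }
    rtpClient.Activate();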

Let me know if that doesn't make sense or if you have any more questions.
Marked as answer by juliusfriedman on 3/15/2016 at 4:25 AM
Coordinator
Mar 15, 2016 at 2:45 PM
I have updated the RtpClient to do the same for sending as for receiving, so the pattern is less confusing to follow.

Please let me know if you're still having trouble, and thanks for bringing that up!
Marked as answer by juliusfriedman on 3/15/2016 at 6:45 AM
Mar 15, 2016 at 4:30 PM
Hello Julius,

https://www.dropbox.com/s/c5hye7givpmqun4/AndroidClientWithMMA936.zip?dl=0

Right now it works like this:

The gateway/server sends me the first INVITE. It is 192.168.1.109.

I save the session description from the INVITE into my variable ReceivedMediaSessionDescription.

The Android client answers with a Ringing without a session description, and also answers OK with my OWN generated session description, using the ID and version received from the gateway to create it. The client is 192.168.1.118.

The gateway then sends an ACK.

The client now creates the RtpClient using FromSessionDescription. The parameter is the ReceivedMediaSessionDescription I remembered from the INVITE.

The gateway starts sending a UDP stream, but I do not get it. Please see the attached ZIP with both the Wireshark capture and the Visual Studio output.
It includes some console logs I added.

The interesting part in the RtpClient:

In RtpClient.cs, in void SendReceieve(), I added this logging:
                            Console.WriteLine("SendReceieve Running TC rtpEnabled: " + rtpEnabled);
                            Console.WriteLine("SendReceieve Running TC Socket.IsConnected: " + tc.RtpSocket.Connected);
                            Console.WriteLine("SendReceieve Running TC Socket.IsBound: " + tc.RtpSocket.IsBound);
                            Console.WriteLine("SendReceieve Running TC m_ReceiveInterval:" + tc.m_ReceiveInterval);
                            Console.WriteLine("SendReceieve Running TC (shouldStop || IsDisposed || m_StopRequested): " + (shouldStop || IsDisposed || m_StopRequested));

                            //If receiving Rtp and the socket is able to read
                            if (rtpEnabled && false == (shouldStop || IsDisposed || m_StopRequested)
                            //Check if the socket can read data
                            //&& tc.RtpSocket.Available > 0 /* 
                            && tc.RtpSocket.Poll((int)(Math.Round(Media.Common.Extensions.TimeSpan.TimeSpanExtensions.TotalMicroseconds(tc.m_ReceiveInterval) / Media.Common.Extensions.TimeSpan.TimeSpanExtensions.MicrosecondsPerMillisecond, MidpointRounding.ToEven)), SelectMode.SelectRead))
                            //&& tc.RtpSocket.Poll((int)Media.Common.Extensions.TimeSpan.TimeSpanExtensions.TotalMicroseconds(tc.m_ReceiveInterval), SelectMode.SelectRead))
                            {
                                //Receive RtpData
                                Console.WriteLine("TryReceiveData: ");
                                receivedRtp += ReceiveData(tc.RtpSocket, ref tc.RemoteRtp, out lastError, rtpEnabled, duplexing, tc.ContextMemory);
                                Console.WriteLine("ReceiveData received: ");
As you can see in the Visual Studio output, it never reaches "TryReceiveData"; it does not survive the Poll.
I also tried putting a larger number into the Poll (I removed the divisor), but that did not help either.

Could the remaining error still be the port? I cannot find it.

Do you have any idea? When I just remove the Poll here, I get an exception afterwards...

Thanks a lot again :-)
Coordinator
Mar 15, 2016 at 9:43 PM
Edited Mar 15, 2016 at 9:49 PM
You're almost there.

You don't need to keep adding those log statements with the Console; the Logger should be sufficient for you to output data, so use that and not the Console.

I will add a specific log for not being able to poll in the next release, but I don't think we need all that data; just a simple 'Media.Common.ILoggingExtensions.Log(Logger, InternalId + "Unable to Poll RtpSocket");' should do.

If Poll is not returning, then you can't read the socket; it's not wise to remove the Poll calls, especially since the socket can be in use from other threads.

There is a problem with your gateway, as I stated before: it says the packets should arrive on port 27014 or 24398, yet your video packets arrive on port 9078 from port 48537, and your audio packets arrive from 7078 to 7078... that is WAY OFF.

The reason it is way off, though, is that you are sending back an SDP which says you will use ports 7078 and 9078 when those are actually not the ports you will use; you should provide the ports you have determined to already be open when you send back the SDP.

This will make the gateway send the packets on the correct ports.

What you could normally do in a case such as this is just set the RemoteRtp port and RemoteRtcp port to 0; that would allow the destination port not to matter, to be discovered on the first receive, and stored thereafter. The problem here is that even if you do that, the RtpClient will bind to a different port than you need it to, because you are not telling the gateway the correct ports to use.

It seems you receive two INVITEs: one which is acknowledged and contained invalid ports, and another which is never acknowledged; this second INVITE contains the correct ports for the audio. The reason for the multiple INVITEs is that the gateway can't reach you on the ports you specified, so it attempts to INVITE again, never gets a response, and just keeps trying on the same ports.

In short, when you send back your OK response to the gateway, you should modify the SDP to use the ports you expect to receive on.

I will show you below exactly what I mean.
SIP/2.0 200 OK
Via: SIP/2.0/UDP 192.168.1.109:5060;branch=z9hG4bK07f952e6
From: <sip:sipclient@192.168.1.109>;tag=as52ab7c65
To: <sip:test@192.168.1.118:49952>;tag=766016738
Call-ID: 76795c6f7a4cb294634b9eb77f7c11b4@192.168.1.109:5060
CSeq: 102 INVITE
Contact: <sip:test@192.168.1.118:49952>
Content-Type: application/sdp
Content-Length: 216
User-Agent: SIP .NET 1.0 evaluation version, www.independentsoft.com

v=0
o=test 727532823 727532823 IN IP4 192.168.1.118
s=SIP Call
c=IN IP4 192.168.1.118
t=0 0
m=audio 7078 RTP/AVP 8 0 <----- This line is wrong, the first part should be what port you expect to listen on
a=rtpmap:8 PCMA/8000
a=rtpmap:0 PCMU/8000
m=video 9078 RTP/AVP 99 <----- This line is wrong, the first part should be what port you expect to listen on
a=rtpmap:99 H264/90000
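
A hedged sketch of that correction on the answering side, assuming Media.Sdp.MediaDescription exposes a settable MediaPort (verify against your version):

    // Before sending the 200 OK, announce ports we will actually listen on
    // (myAudioPort / myVideoPort are hypothetical, already-determined-open ports).
    foreach (Media.Sdp.MediaDescription md in mySessionDescription.MediaDescriptions)
    {
        md.MediaPort = (md.MediaType == MediaType.audio) ? myAudioPort : myVideoPort;
    }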
If you can't change that for some reason using their software, then you will have to adjust the RtpClient to use the indicated ports by passing the rtpPort and rtcpPort to the FromSessionDescription function.
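
For instance (parameter names taken from the overload visible in the stack trace at the top of this thread; a sketch, so verify them against your version):

    // Pin the ports instead of letting the client pick them.
    RtpClient rtpClient = Media.Rtp.RtpClient.FromSessionDescription(
        sessionDescription,
        rtpPort: expectedRtpPort,       // hypothetical variable: the RTP port from the gateway's SDP
        rtcpPort: expectedRtpPort + 1);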

Finally, it seems you will be using a SIP client from Independentsoft. I don't want to take any of their business away, but if you were to donate to this project it would be a tax deduction for your business, and additionally I would provide full support for SIP. 100 Euro seems a little steep for just the SIP protocol implementation without any RTP.

Let me know if you want to consider this option, and I will begin to work more on the SIP implementation, which will resemble the RtspClient / HttpClient implementations.




I also updated the code @ 111945 so you can see if you were unable to Poll, thanks for bringing that up!
Marked as answer by juliusfriedman on 3/15/2016 at 1:45 PM
Mar 16, 2016 at 9:02 AM
Hello Julius,

That was great! I am using rtpPort 0 and rtcpPort 0 now when initializing the RtpClient from the media session.
And for generating my own media session, I now use the remembered ports from the TransportContext.

I used your new 945 and added the following, for my own reference:

I added these two properties to the TransportContext in RtpClient.cs around line 665:

public int Localport { get; private set; }
public int Remoteport { get; private set; }

and set both in the TransportContext's Initialize method, in the try block around line 1467:

Localport = localRtp.Port;
Remoteport = remoteRtp.Port;

Thanks a lot, that was great support!

I will ask my company whether a donation is possible. I hope and think so (if not, I will privately donate you a few beers).
This RtpClient really helps us a lot (that should be reason enough), and SIP from the same open-source library would be good.

I will tell you later.

Thanks a lot for your library and your support!

Firlefanz
Coordinator
Mar 16, 2016 at 12:47 PM
Why do you need to add the local port and remote port?

Can't you just create an object of your choosing and put it on the ApplicationContext property if you really need the state?

There are already RemoteRtp and LocalRtp properties; I am not sure why you would need those additional ones.
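
For example (a sketch, assuming they are IPEndPoint instances, as their use elsewhere in RtpClient.cs suggests):

    int localPort = ((System.Net.IPEndPoint)tc.LocalRtp).Port;   // tc is a TransportContext
    int remotePort = ((System.Net.IPEndPoint)tc.RemoteRtp).Port;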

Anyway, glad you're all squared away now.

Hopefully we will talk again soon about SIP support.

Cheers!
Marked as answer by juliusfriedman on 3/16/2016 at 4:47 AM
Coordinator
Apr 2, 2016 at 3:40 PM
FYI, SIP messages should work now.

There isn't a client yet but that will be coming.

Also maybe SAP, let me know if it helps!
Apr 2, 2016 at 4:40 PM
Maybe SIP will be useful for me later. Right now I need to play the video; I am having very big problems playing back the H264 from the RTP stream on my Xamarin Android device...

But thanks a lot for letting me know, and have a nice weekend!
Coordinator
Apr 2, 2016 at 4:46 PM
MediaCodec should do that for you on Android, and VideoToolbox or CoreVideo on iOS.

If there's anything I can do let me know.
Marked as answer by juliusfriedman on 4/2/2016 at 8:46 AM
Apr 4, 2016 at 9:11 AM
Hello Julius,

Thanks (again and again) a lot for your help! :-)

I have a new hardware camera and gateway now, since the old ones did not work correctly.

I am now trying to split video and audio when receiving data in your RtpClient:
        internal void OnSourceRtpPacketRecieved(object client, RtpPacket packet = null, RtpClient.TransportContext tc = null)
        {
            if (packet != null)
            {
                ClientLogger.Instance.Log("Got a RTP packet, SequenceNo = " + packet.SequenceNumber + " PayloadType = " + packet.PayloadType.ToString() + " Length = " + packet.Length);
                byte[] decodePack = new List<byte>(packet.PayloadData).ToArray();
                if (tc.MediaDescription.MediaType == MediaType.video)
                {
                    if (VideoDataReceived != null)
                        VideoDataReceived(this, new DataEventArgs() { Data = decodePack });
                }
                else if (tc.MediaDescription.MediaType == MediaType.audio && packet.Length > 100)
                {
                    if (AudioDataReceived != null)
                        AudioDataReceived(this, new DataEventArgs() { Data = decodePack });
                }
            }
            else
            {
                ClientLogger.Instance.Log("NULL - RTP packet received");
            }
        }
I am using an Android AudioTrack to play back the audio; it is delayed but works. My problem is still the video...

Thanks for your hint; I was already trying the MediaCodec class (after just opening the video stream with a VideoView or MediaPlayer did not work)...
Apr 4, 2016 at 9:16 AM
Edited Apr 4, 2016 at 9:17 AM
For my own H264Decoder I wanted to use the SPS and PPS headers first to decode. But they are not in the session description (anymore); after FromSessionDescription, when activating the RtpClient, the SDP looks like this:

v=0
o=root 969084384 969084384 IN IP4 192.168.1.132
s=Asterisk PBX GIT-master-60a15fe
c=IN IP4 192.168.1.132
b=CT:384
t=0 0
m=audio 11768 RTP/AVP 0 8
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=maxptime:150
a=sendrecv
m=video 27630 RTP/AVP 99 34
a=rtpmap:99 H264/90000
a=rtpmap:34 H263/90000
a=sendrecv

There is no sprop or anything like it. When I asked the camera and gateway manufacturer, he told me he took a Wireshark log, extracted the UDP stream, and then used GStreamer like this:
gst-launch-1.0 udpsrc port=5008 caps="application/x-rtp" !
rtph264depay ! h264parse ! matroskamux ! filesink location=test.mkv
After that he got an MKV file which can be watched using VLC player, for example. And he told me the resolution is 640x480.
That's the info I got.
Apr 4, 2016 at 9:24 AM
My own H264 decoder looks like this, after a lot of searching on the Internet:
        public H264Decoder(Surface surface, int width, int height)
        {
            _codec = MediaCodec.CreateDecoderByType("video/avc");
            _format = MediaFormat.CreateVideoFormat("video/avc", width, height);
            //_format = new MediaFormat();
            //_format.SetInteger(MediaFormat.KeyMaxInputSize, width * height);
            _format.SetInteger(MediaFormat.KeyWidth, width);
            _format.SetInteger(MediaFormat.KeyHeight, height);
            _format.SetInteger(MediaFormat.KeyMaxWidth, width);
            _format.SetInteger(MediaFormat.KeyMaxHeight, height);

            //_format.SetInteger(MediaFormat.KeyStride, 720);
            //_format.SetInteger(MediaFormat.KeySliceHeight, 576);

            //_format.SetInteger(MediaFormat.KeyPushBlankBuffersOnStop, 1);
            _surface = surface;
        }
Each packet then calls these two methods. The first block is used only the first time (but I do not have any SPS or PPS, so this does not work at all):
    public void Start()
        {
            if (!_isRunning)
            {
                if (RTPClientManager.Instance.Parameters.ContainsKey("H264"))
                {
                    if (RTPClientManager.Instance.Parameters["H264"].ContainsKey("sprop-parameter-sets"))
                    {
                        String[] spsString = RTPClientManager.Instance.Parameters["H264"]["sprop-parameter-sets"].Split(',');
                        try
                        {

                            byte[] sps = Base64.Decode(spsString[0], Base64Flags.Default);
                            _format.SetByteBuffer("csd-0", ByteBuffer.Wrap(sps));
                            byte[] pps = Base64.Decode(spsString[1], Base64Flags.Default);
                            _format.SetByteBuffer("csd-1", ByteBuffer.Wrap(pps));
                        }
                        catch (Exception er)
                        {

                        }
                    }
                }                

                _codec.Configure(_format, _surface, null, 0);

                _codec.Start();
                _isRunning = true;
                _isConfigured = false;
            }   
    }
And this decodes each packet:
        public void Decode(byte[] data)
        {
            int fragment_type = data[0] & 0x1F; // NAL unit type of the first byte (28/29 = FU-A/FU-B fragments)
            int nal_type = data[1] & 0x1F;      // for fragments: the original NAL type from the FU header
            int start_bit = data[1] & 0x80;     // for fragments: the FU start bit

            bool isIFrame = (((fragment_type == 28 || fragment_type == 29) && nal_type == 5 && start_bit == 128) || fragment_type == 5);

            Console.Out.WriteLine("fragment_type: {0} nal_type: {1} start_bit: {2} isIFrame: {3}", fragment_type, nal_type, start_bit, isIFrame);

            try {    
                int inputBufferId = _codec.DequeueInputBuffer(10000);
                if (inputBufferId >= 0)
                {
                    ByteBuffer inputBuffer = _codec.GetInputBuffer(inputBufferId);
                    // fill inputBuffer with valid data
                    inputBuffer.Clear();
                    inputBuffer.Put(data,0, data.Length);
                    MediaCodecBufferFlags flags = MediaCodecBufferFlags.None;
                    if (isIFrame)
                    {
                        flags &= MediaCodecBufferFlags.KeyFrame; // note: '&=' with None always yields None; '|=' was presumably intended
                    }
                    else {
                        if (!_isConfigured)
                        {
                            flags &= MediaCodecBufferFlags.CodecConfig; // note: same '&=' issue as above
                            _isConfigured = true;
                        }
                    }
                    _codec.QueueInputBuffer(inputBufferId, 0, data.Length, 33, flags);
                }
                MediaCodec.BufferInfo buffInfo = new MediaCodec.BufferInfo();
                int outputBufferId = _codec.DequeueOutputBuffer(buffInfo, 100000); // 10000
                if (outputBufferId == (int)MediaCodecInfoState.TryAgainLater)
                {
                    // no output available yet 
                    System.Diagnostics.Debug.WriteLine("no output from decoder available");
                    //if (VERBOSE) Log.d(TAG, "no output from decoder available"); 
                }
                else if (outputBufferId == (int)MediaCodecInfoState.OutputBuffersChanged)
                {
                    // The storage associated with the direct ByteBuffer may already be unmapped, 
                    // so attempting to access data through the old output buffer array could 
                    // lead to a native crash. 
                    System.Diagnostics.Debug.WriteLine("decoder output buffers changed");
                    //if (VERBOSE) Log.d(TAG, "decoder output buffers changed"); 
                    //decoderOutputBuffers = decoder.getOutputBuffers(); 
                }
                else if (outputBufferId == (int)MediaCodecInfoState.OutputFormatChanged)
                {
                    // this happens before the first frame is returned 
                    _format = _codec.GetOutputFormat(outputBufferId);
                    System.Diagnostics.Debug.WriteLine("decoder output format changed: " + _format);
                    //if (VERBOSE) Log.d(TAG, "decoder output format changed: " + decoderOutputFormat); 
                }
                else if (outputBufferId < 0)
                {
                    System.Diagnostics.Debug.WriteLine("unexpected result from deocder.dequeueOutputBuffer: " + outputBufferId);
                    //fail("unexpected result from deocder.dequeueOutputBuffer: " + decoderStatus); 
                }
                else
                {  // decoderStatus >= 0 
                    System.Diagnostics.Debug.WriteLine("surface decoder given buffer " + outputBufferId + " (size=" + buffInfo.Size + ")");
                    //if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + decoderStatus + " (size=" + info.size + ")"); 
                    //rawSize += info.size; 
                    if ((buffInfo.Flags & MediaCodecBufferFlags.EndOfStream) != 0)
                    {
                        //if (VERBOSE) Log.d(TAG, "output EOS"); 
                        //outputDone = true; 
                    }

                    bool doRender = (buffInfo.Size != 0);

                    // As soon as we call releaseOutputBuffer, the buffer will be forwarded 
                    // to SurfaceTexture to convert to a texture.  The API doesn't guarantee 
                    // that the texture will be available before the call returns, so we 
                    // need to wait for the onFrameAvailable callback to fire. 
                    _codec.ReleaseOutputBuffer(outputBufferId, doRender);
                    if (doRender)
                    {
                        System.Diagnostics.Debug.WriteLine("awaiting frame ");
                        //if (VERBOSE) Log.d(TAG, "awaiting frame " + checkIndex); 
                        //assertEquals("Wrong time stamp", computePresentationTime(checkIndex), info.presentationTimeUs); 
                        //_surface.awaitNewImage();
                        //_surface.drawImage();
                        //if (!checkSurfaceFrame(checkIndex++)) { 
                        //  badFrames++; 
                        //} 
                    }
                }
            }catch(Exception ex)
            {

            }
    }
This one always runs into this:

System.Diagnostics.Debug.WriteLine("no output from decoder available");


If you could tell me what I am doing wrong, or whether my idea of how to set up the codec is correct, that would be great...
Coordinator
Apr 4, 2016 at 12:50 PM
Post a Wireshark capture from the new gateway so I can take a look.

It seems the SPS and PPS are in-band in the stream, and that's why they are not in the SDP, but I can't really confirm that without the capture.

'No output from decoder' doesn't have anything to do with this library; I would try to decode a single access unit which is known to work with other decoders and go from there.

The audio is working, so video should work the same way. The x-rtp is also not really 100% supported yet, but I will see about finishing support for it if you can provide me the capture which shows the semantics of the transport; I doubt it uses x-rtp at this time, but we will see.
Marked as answer by juliusfriedman on 4/4/2016 at 4:50 AM
Apr 4, 2016 at 1:14 PM
Edited Apr 4, 2016 at 1:14 PM
Thanks:

https://www.dropbox.com/s/02hzdldjqt8845t/Wireshark-AndroidClient-NewGateway.zip?dl=0

Yes it seems I am getting a sps and pps paket in the stream.
Coordinator
Apr 4, 2016 at 1:49 PM
Okay this capture looks fine and is not using 'rtx' AFAIK.

What is your problem with this scenario exactly?
Marked as answer by juliusfriedman on 4/4/2016 at 5:49 AM
Apr 4, 2016 at 1:58 PM
I took the video port from the stream's SDP and tried to give the server IP with that port to the Android VideoView.
That told me "Cannot play back video", or something like that in German.

I tried the Android MediaPlayer, which crashes or hangs in Prepare(Async) when I do.

So I searched for how to decode it myself and wrote the class above. But it cannot produce working output, it seems.


The source is a (raw) H264 stream in 640x480, and the audio is PCMU. I have to develop an app with Xamarin which supports Android and later iOS;
one important feature, among others, is this video call.

I am thinking about trying MediaMuxer next, but I am already getting desperate...

Thanks for your help.
Coordinator
Apr 4, 2016 at 2:49 PM
Edited Apr 4, 2016 at 3:15 PM
Does your class above work with the data found here?

Try manually feeding it data which you can expect to produce a result; e.g., you should be able to produce the sample picture using the data given in that thread with your decoder.

If you can do that, then there should be no problem using data from any other source just as well.

I don't see how your Decode method is called, so it's hard to say exactly what you're doing there, but that logic is already done in the RFC6184 class, which also contains more logic to properly add start codes.

If your decoder can handle NAL units directly, then you should be able to give it the payload of the RtpPacket which contains the NALs, without even using the RFC6184 class.

This should be fairly easy to achieve; I think the problem is that you're passing NAL units to a decoder which expects entire access units.

Let me know what I can do to help you!
Marked as answer by juliusfriedman on 4/4/2016 at 6:49 AM
Apr 4, 2016 at 2:57 PM
Edited Apr 4, 2016 at 2:58 PM
I feed it from here; it is just somewhere else, so I am using events to pull it through. It is the original byte array with the payload data from here:

byte[] decodePack = new List<byte>(packet.PayloadData).ToArray();

        internal void OnSourceRtpPacketRecieved(object client, RtpPacket packet = null, RtpClient.TransportContext tc = null)
        {
            if (packet != null)
            {
                ClientLogger.Instance.Log("Got a RTP packet, SequenceNo = " + packet.SequenceNumber + " PayloadType = " + packet.PayloadType.ToString() + " Length = " + packet.Length);
                byte[] decodePack = new List<byte>(packet.PayloadData).ToArray();
                if (tc.MediaDescription.MediaType == MediaType.video)
                {
                    if (VideoDataReceived != null)
                        VideoDataReceived(this, new DataEventArgs() { Data = decodePack });
                }
                else if (tc.MediaDescription.MediaType == MediaType.audio && packet.Length > 100)
                {
                    if (AudioDataReceived != null)
                        AudioDataReceived(this, new DataEventArgs() { Data = decodePack });
                    //SaveAudioData(packet);
                }
            }
            else
            {
                ClientLogger.Instance.Log("NULL - RTP packet received");
            }
        }
Coordinator
Apr 4, 2016 at 3:20 PM
Right, does feeding the data for the sample IFrame in the linked thread work as expected on your decoder?

You have to determine what format the decoder expects, e.g. with start codes or without, and then feed it the data correctly.

You are pretty much done here after you do that; typically the data must have a start code, and that is achieved by using the RFC6184Frame class: it takes an existing RtpFrame and depacketizes each packet to a Buffer which can subsequently be fed to a decoder or copied to a file.

I would start by making a .264 file from the incoming frames, using the RFC6184Frame class to depacketize the packets and then appending the 'Buffer' to a .h264 file.

This file should be playable by VLC, FFMPEG, or GStreamer.

If your decoder can't play it, then there is something wrong with your decoder, or you have to initialize it differently.

I think what you're missing is the use of the RFC6184Frame class...
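
A minimal sketch of that test, using only calls that appear later in this thread (RFC6184Frame copy construction, Depacketize, Buffer); treat it as an outline rather than the documented API:

    // Append each depacketized frame (start-code-prefixed NAL units) to a raw .h264 file.
    using (var fs = new System.IO.FileStream("test.h264", System.IO.FileMode.Append, System.IO.FileAccess.Write))
    using (var profileFrame = new RFC6184Media.RFC6184Frame(rtpFrame)) // rtpFrame: a complete RtpFrame
    {
        profileFrame.Depacketize();
        if (profileFrame.Buffer != null) profileFrame.Buffer.CopyTo(fs);
    }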
Marked as answer by juliusfriedman on 4/4/2016 at 7:20 AM
Apr 5, 2016 at 8:21 AM
Edited Apr 5, 2016 at 8:44 AM
Hello Julius,

your RFC6184Frame class uses System.Drawing, and I think so does its parent.

So I would remove that and all the drawing-related code for my purposes, and try calling ProcessPacket with my received RtpPackets.

And after a few packets I can use the "Buffer" to continue with a file or a media player, right?

I got a MicroSIP call working with video. What its log said may be interesting:

08:30:41.146 ffmpeg h264: no frame!
08:30:41.146 ffmpeg_vid_cod ffmpeg err -1094995529: Invalid data found when processing input
08:30:41.146 vstdec0325B2AC codec decode() error: Bad or corrupted bitstream (PJMEDIA_CODEC_EBADBITSTREAM) [err:220087]
08:30:41.240 vstdec0325B2AC ! Decoding format changed: 640x480 I420<- 90000/3070(~29)fps
08:30:41.240 sdl_dev.c Stopping sdl video stream
08:30:41.247 sdl_dev.c Starting sdl video stream
08:30:41.309 vstdec0325B2AC Decoding format changed: 640x480 I420<- 90000/3067(~29)fps
08:30:41.626 vstdec0325B2AC
Apr 5, 2016 at 12:19 PM
Hello Julius,

Because your class and DLL use System.Drawing, which I cannot use like this in Xamarin, I first copied your RFC6184Frame class into my decoder class.
Now your RtpClient's OnSourceRtpPacketRecieved calls my "DecodePacket" each time with the RtpPacket packet.

I am not sure about your parameters ignoreForbiddenZerobits (tried both) and fullStartCodes (tried always false, and also tried to calculate them).
        RFC6184Media.RFC6184Frame frame;

        public void DecodePacket(Media.Rtp.RtpPacket packet)
        {
            if (frame == null)
            {
                frame = new RFC6184Media.RFC6184Frame((byte)packet.PayloadType);
                frame.Buffer = new System.IO.MemoryStream();
            }

            byte[] data = new List<byte>(packet.PayloadData).ToArray();
            int fragment_type = data[0] & 0x1F;
            int nal_type = data[1] & 0x1F;
            int start_bit = data[1] & 0x80;

            bool isIFrame = (((fragment_type == 28 || fragment_type == 29) && nal_type == 5 && start_bit == 128) || fragment_type == 5);

            frame.ProcessPacket(packet, false, !isIFrame); // TODO determine start codes? test with and without the forbidden flag
            Console.Out.WriteLine("video buffer length {0}", frame.Buffer.Length);

            Decode(frame.Buffer.ToArray(), isIFrame);
            frame.Buffer.Close();
            frame.Buffer = null;
            frame.Buffer = new System.IO.MemoryStream();
        }
Decode() passes it to a simplified method of my decoder:
        public void Decode(byte[] data, bool isIFrame)
    {
            //int fragment_type = data[0] & 0x1F;
            //int nal_type = data[1] & 0x1F;
            //int start_bit = data[1] & 0x80;

            //bool isIFrame = (((fragment_type == 28 || fragment_type == 29) && nal_type == 5 && start_bit == 128) || fragment_type == 5);

            //Console.Out.WriteLine("fragment_type: {0} nal_type: {1} start_bit: {2} isIFrame: {3}", fragment_type, nal_type, start_bit, isIFrame);

            try {

                int inputBufferId = _codec.DequeueInputBuffer(-1);
                if (inputBufferId >= 0)
                {
                    //ByteBuffer inputBuffer = _codec.GetInputBuffer(inputBufferId);
                    //// fill inputBuffer with valid data
                    //inputBuffer.Clear();
                    ////inputBuffer.Put(data,0, data.Length);
                    //inputBuffer.Put(data);
                    MediaCodecBufferFlags flags = MediaCodecBufferFlags.None;
                    if (isIFrame)
                    {
                        flags = MediaCodecBufferFlags.KeyFrame; // SyncFrame?
                    }
                    else {
                    //    //if (!_isConfigured)
                        {
                            flags = MediaCodecBufferFlags.CodecConfig;
                    //      //  _isConfigured = true;
                        }
                    }
                    ////_codec.QueueInputBuffer(inputBufferId, 0, data.Length, 33, flags);
                    _codec.QueueInputBuffer(inputBufferId, 0, data.Length, 0, flags);
                    //_codec.QueueInputBuffer(inputBufferId, 0, data.Length, 0, 0);
                }


                MediaCodec.BufferInfo buffInfo = new MediaCodec.BufferInfo();
                int outputBufferId = _codec.DequeueOutputBuffer(buffInfo, timeout);
                if (outputBufferId == (int)MediaCodecInfoState.TryAgainLater)
                {
                    // no output available yet 
                    System.Diagnostics.Debug.WriteLine("no output from decoder available");
                    //if (VERBOSE) Log.d(TAG, "no output from decoder available"); 
                }
                else if (outputBufferId == (int)MediaCodecInfoState.OutputBuffersChanged)
                {
                    // The storage associated with the direct ByteBuffer may already be unmapped, 
                    // so attempting to access data through the old output buffer array could 
                    // lead to a native crash. 
                    System.Diagnostics.Debug.WriteLine("decoder output buffers changed");
                    //if (VERBOSE) Log.d(TAG, "decoder output buffers changed"); 
                    //decoderOutputBuffers = decoder.getOutputBuffers(); 
                }
                else if (outputBufferId == (int)MediaCodecInfoState.OutputFormatChanged)
                {
                    // this happens before the first frame is returned 
                    _format = _codec.GetOutputFormat(outputBufferId);
                    System.Diagnostics.Debug.WriteLine("decoder output format changed: " + _format);
                    //if (VERBOSE) Log.d(TAG, "decoder output format changed: " + decoderOutputFormat); 
                }
                else if (outputBufferId < 0)
                {
                    System.Diagnostics.Debug.WriteLine("unexpected result from deocder.dequeueOutputBuffer: " + outputBufferId);
                    //fail("unexpected result from deocder.dequeueOutputBuffer: " + decoderStatus); 
                }
                else
                {  // decoderStatus >= 0 
                    System.Diagnostics.Debug.WriteLine("surface decoder given buffer " + outputBufferId + " (size=" + buffInfo.Size + ")");
                    //if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + decoderStatus + " (size=" + info.size + ")"); 
                    //rawSize += info.size; 
                    if ((buffInfo.Flags & MediaCodecBufferFlags.EndOfStream) != 0)
                    {
                        //if (VERBOSE) Log.d(TAG, "output EOS"); 
                        //outputDone = true; 
                    }

                    bool doRender = (buffInfo.Size != 0);

                    // As soon as we call releaseOutputBuffer, the buffer will be forwarded 
                    // to SurfaceTexture to convert to a texture.  The API doesn't guarantee 
                    // that the texture will be available before the call returns, so we 
                    // need to wait for the onFrameAvailable callback to fire. 
                    _codec.ReleaseOutputBuffer(outputBufferId, doRender);
                    if (doRender)
                    {
                        System.Diagnostics.Debug.WriteLine("awaiting frame ");
                        //if (VERBOSE) Log.d(TAG, "awaiting frame " + checkIndex); 
                        //assertEquals("Wrong time stamp", computePresentationTime(checkIndex), info.presentationTimeUs); 
                        //_surface.awaitNewImage();
                        //_surface.drawImage();
                        _surface.Show();
                        //if (!checkSurfaceFrame(checkIndex++)) { 
                        //  badFrames++; 
                        //} 
                    }
                }
            }catch(Exception ex)
            {

            }
    }
Does this look okay so far, or any idea what's wrong now? Again, it always runs into
System.Diagnostics.Debug.WriteLine("no output from decoder available");


I'll try a file now, too.
Coordinator
Apr 5, 2016 at 12:50 PM
Edited Apr 5, 2016 at 12:52 PM
Yes, you're correct; I will be moving the frame class soon enough so it's easier to access, but you're correct in how you extracted it from the RtspServer in this case.

Also, it seems you're not calling Depacketize at all.

Did you see TestRFC6184VideoFrame in the UnitTest project?

It shows how to call Depacketize and access the Buffer with the NAL units, which will already be prefixed with start codes.

You only need to add the SPS and PPS from the SDP if they're in the SDP; in this case you can skip that step and just pass the Buffer which is available after calling Depacketize.

That should work fine.
Marked as answer by juliusfriedman on 4/5/2016 at 4:52 AM
Apr 5, 2016 at 3:36 PM
Hello Julius,

Thanks a lot for that hint. I took a look at it and changed my DecodePacket like this
(I know the "frame" no longer makes sense; I'll fix it later. Just a hint: PayloadType is sometimes a byte, sometimes an int.)

Your RtpClient's OnSourceRtpPacketRecieved still calls my "DecodePacket" each time with the RtpPacket packet.
        public void DecodePacket(Media.Rtp.RtpPacket packet)
        {
            try
            {
                if (frame == null)
                {
                    frame = new RFC6184Media.RFC6184Frame((byte)packet.PayloadType);
                    frame.Buffer = new System.IO.MemoryStream();
                    string fileName = System.IO.Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal), "ownVideo.mkv");
                   if (System.IO.File.Exists(fileName))
                        System.IO.File.Delete(fileName);
                    fileStream = new System.IO.FileStream(fileName, System.IO.FileMode.Create, System.IO.FileAccess.Write);                   
                }
                if (memoryStream == null)
                    memoryStream = new System.IO.MemoryStream();

                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(99))
                {
                    //Create the managed packet from that binary data and add it to the frame.
                    profileFrame.Add(packet);

                    //Remove any profile header and create a stream which can be given to a decoder.
                    profileFrame.Depacketize();

                    //If there is no buffer then there is nothing to process.
                    if (profileFrame.Buffer == null) return;

                    //If there is not a sps or pps in band and this is the first frame given to a decoder then it needs to contain a SPS and PPS
                    //This is typically retrieved from the SessionDescription or CodecPrivateData but only the very first time.
                    if (!m_InitializedStream && false == profileFrame.ContainsSequenceParameterSet || false == profileFrame.ContainsPictureParameterSet)
                    {
                        //From the MediaDescription.FmtpLine from the SessionDescription which describes the media.
                        //Media.Sdp.SessionDescriptionLine fmtp = new Sdp.SessionDescriptionLine("a=fmtp:97 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg==");

                        //We will extract the sps and pps from that line.
                        byte[] sps = null, pps = null;

                        //If there was a fmtp line then iterate the parts contained.
                        //if (fmtp != null) foreach (string p in fmtp.Parts)
                        //    {
                        //        string trim = p.Trim();
                        //        if (trim.StartsWith("sprop-parameter-sets=", StringComparison.InvariantCultureIgnoreCase))
                        //        {
                        //            string[] data = trim.Replace("sprop-parameter-sets=", string.Empty).Split(',');
                        //            sps = System.Convert.FromBase64String(data[0]);
                        //            pps = System.Convert.FromBase64String(data[1]);
                        //            break;
                        //        }
                        //    }

                        sps = System.Convert.FromBase64String("42401fa680b41264");
                        sps = System.Convert.FromBase64String("ce30a480");

                        //Prepend the SPS if it was found
                        if (sps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the SPS
                            fileStream.Write(sps, 0, sps.Length);
                            memoryStream.Write(sps, 0, sps.Length);
                        }

                        //Prepend the PPS if it was found.
                        if (pps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the PPS
                            fileStream.Write(pps, 0, pps.Length);
                            memoryStream.Write(pps, 0, pps.Length);
                        }

                        //Don't do this again...
                        //m_InitializedStream = true;
                    }

                    //Write the data in the frame.
                    profileFrame.Buffer.CopyTo(fileStream);
                    profileFrame.Buffer.CopyTo(memoryStream);

                    Decode(memoryStream.ToArray(), false);
                    memoryStream.Close();
                    memoryStream = null;
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }               
        }
Is my Decode() called at the right place, and is it correct this way? I still run into that System.Diagnostics.Debug.WriteLine("no output from decoder available");
but I hope I am getting closer :-)

I have hard-coded SPS and PPS values; is it okay to add them only once, this way?

Thanks a lot again!
Coordinator
Apr 5, 2016 at 5:03 PM
It's fine to add the SPS and PPS that way if you need to; you could also just add a line to the media description.

Finally, you are using packets when you should be using frames, as some packets could contain only part of a single NAL, e.g. a fragment.

There is the FrameChanged event, which provides frames for this purpose.

If you are sure there are no fragmented NAL units, then this is fine, I guess.

Remember, if you're using m_InitializedStream, to set it to true after you write your SPS and PPS.
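
A sketch of that handler (the signature as it appears later in this thread), gating on complete video frames:

    internal void OnSourceFrameChanged(object sender, RtpFrame frame = null,
        RtpClient.TransportContext tc = null, bool final = false)
    {
        if (frame == null || !frame.IsComplete) return; // wait until the whole frame has arrived
        if (tc != null && tc.MediaDescription.MediaType == MediaType.video)
            DecodeFrame(frame); // hand the complete frame to the decoder
    }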
Marked as answer by juliusfriedman on 4/5/2016 at 9:03 AM
Apr 6, 2016 at 7:21 AM
Edited Apr 6, 2016 at 7:24 AM
Hello Julius,

Yes, the video packets are only 1400 bytes each; in this case it is not a frame, it is a packet. Very good point!

Can I do the above, taken from your unit test, but with a frame only?

Now there is no frame.Buffer (do I need it?) and no profileFrame.Add(frame) anymore...

Using (RFC6184Media.RFC6184Frame profileFrame = frame) gets me a type problem...

I am working on this, currently as below, with those two problems:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                if (frame == null)
                {
                    //frame = new Media.Rtsp.Server.MediaTypes.RFC6184Media.RFC6184Frame(0);
                    frame = new RFC6184Media.RFC6184Frame(frame.PayloadTypeByte);
                    frame.Buffer = new System.IO.MemoryStream();
                    string fileName = System.IO.Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal), "ownVideo.mkv");
                    //string fileName = System.IO.Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.MyVideos), "ownVideo.mkv");
                    //string fileName = System.IO.Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.CommonVideos), "ownVideo.mkv");
                    if (System.IO.File.Exists(fileName))
                        System.IO.File.Delete(fileName);
                    fileStream = new System.IO.FileStream(fileName, System.IO.FileMode.Create, System.IO.FileAccess.Write);                   
                }
                if (memoryStream == null)
                    memoryStream = new System.IO.MemoryStream();

                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(99))
                {
                    //Create the managed packet from that binary data and add it to the frame.
                    profileFrame.Add(frame);

                    //Remove any profile header and create a stream which can be given to a decoder.
                    profileFrame.Depacketize();

                    //If there is no buffer then there is nothing to process.
                    if (profileFrame.Buffer == null) return;

                    //If there is not a sps or pps in band and this is the first frame given to a decoder then it needs to contain a SPS and PPS
                    //This is typically retrieved from the SessionDescription or CodecPrivateData but only the very first time.
                    if (!m_InitializedStream && false == profileFrame.ContainsSequenceParameterSet || false == profileFrame.ContainsPictureParameterSet)
                    {
                        //From the MediaDescription.FmtpLine from the SessionDescription which describes the media.
                        //Media.Sdp.SessionDescriptionLine fmtp = new Sdp.SessionDescriptionLine("a=fmtp:97 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg==");

                        //We will extract the sps and pps from that line.
                        byte[] sps = null, pps = null;

                        //If there was a fmtp line then iterate the parts contained.
                        //if (fmtp != null) foreach (string p in fmtp.Parts)
                        //    {
                        //        string trim = p.Trim();
                        //        if (trim.StartsWith("sprop-parameter-sets=", StringComparison.InvariantCultureIgnoreCase))
                        //        {
                        //            string[] data = trim.Replace("sprop-parameter-sets=", string.Empty).Split(',');
                        //            sps = System.Convert.FromBase64String(data[0]);
                        //            pps = System.Convert.FromBase64String(data[1]);
                        //            break;
                        //        }
                        //    }

                        sps = System.Convert.FromBase64String("42401fa680b41264");
                        sps = System.Convert.FromBase64String("ce30a480");
                        m_InitializedStream = true;

                        //Prepend the SPS if it was found
                        if (sps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the SPS
                            fileStream.Write(sps, 0, sps.Length);
                            memoryStream.Write(sps, 0, sps.Length);
                        }

                        //Prepend the PPS if it was found.
                        if (pps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the PPS
                            fileStream.Write(pps, 0, pps.Length);
                            memoryStream.Write(pps, 0, pps.Length);
                        }

                        //Don't do this again...
                        //m_InitializedStream = true;
                    }

                    //Write the data in the frame.
                    profileFrame.Buffer.CopyTo(fileStream);
                    profileFrame.Buffer.CopyTo(memoryStream);

                    Decode(memoryStream.ToArray(), false);
                    memoryStream.Close();
                    memoryStream = null;
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }               
        }
Apr 6, 2016 at 7:33 AM
Edited Apr 6, 2016 at 7:40 AM
I'll try

using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))

and of course replaced frame == null with !DecoderIsInitalized and set it to true when creating the filestream etc.
Apr 6, 2016 at 8:07 AM
Edited Apr 6, 2016 at 9:21 AM
Hi Julius,

here is the new summary. This makes sense to me, but it still runs me into System.Diagnostics.Debug.WriteLine("no output from decoder available");

First, I now use OnSourceFrameChanged instead of the packet event, right?
        internal void OnSourceFrameChanged(object sender, RtpFrame frame = null, RtpClient.TransportContext tc = null, bool final = false)
        {            
            if (frame != null)
            {
                ClientLogger.Instance.Log("Got a RTP Frame PacketCount = " + frame.Count + " Complete = " + frame.IsComplete);
                if (tc.MediaDescription.MediaType == MediaType.video)
                {
                    if (VideoFrameReceived != null)
                        VideoFrameReceived(this, frame);
                }
            }
            else
            {
                ClientLogger.Instance.Log("NULL - RTP frame received");
            }
        }
With events, that gets me to my new function based on your unit test:
        RFC6184Media.RFC6184Frame frame;

        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                bool isConfigFrame = false;
                if (!DecoderIsInitalized)
                {
                    DecoderIsInitalized = true;
                    //frame = new RFC6184Media.RFC6184Frame(frame.PayloadTypeByte);
                    //frame.Buffer = new System.IO.MemoryStream();
                    string fileName = System.IO.Path.Combine(System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal), "ownVideo.mkv");
                   if (System.IO.File.Exists(fileName))
                        System.IO.File.Delete(fileName);
                    fileStream = new System.IO.FileStream(fileName, System.IO.FileMode.Create, System.IO.FileAccess.Write);                   
                }
                if (memoryStream == null)
                    memoryStream = new System.IO.MemoryStream();

               using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    //Create the managed packet from that binary data and add it to the frame.
                    //profileFrame.Add(frame);

                    //Remove any profile header and create a stream which can be given to a decoder.
                    profileFrame.Depacketize();

                    //If there is no buffer then there is nothing to process.
                    if (profileFrame.Buffer == null) return;

                    //If there is not a sps or pps in band and this is the first frame given to a decoder then it needs to contain a SPS and PPS
                    //This is typically retrieved from the SessionDescription or CodecPrivateData but only the very first time.
                    if (!m_InitializedStream && (false == profileFrame.ContainsSequenceParameterSet || false == profileFrame.ContainsPictureParameterSet))
                    {
                        //From the MediaDescription.FmtpLine from the SessionDescription which describes the media.
                        //Media.Sdp.SessionDescriptionLine fmtp = new Sdp.SessionDescriptionLine("a=fmtp:97 packetization-mode=1;profile-level-id=42C01E;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg==");

                        //We will extract the sps and pps from that line.
                        byte[] sps = null, pps = null;

                        sps = System.Convert.FromBase64String("42401fa680b41264");
                        pps = System.Convert.FromBase64String("ce30a480");
                        m_InitializedStream = true;
                        isConfigFrame = true;

                        //Prepend the SPS if it was found
                        if (sps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the SPS
                            fileStream.Write(sps, 0, sps.Length);
                            memoryStream.Write(sps, 0, sps.Length);
                        }

                        //Prepend the PPS if it was found.
                        if (pps != null)
                        {
                            //Emulation prevention, present for SPS or PPS
                            fileStream.WriteByte(0);
                            memoryStream.WriteByte(0);

                            //Write the start code
                            fileStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);
                            memoryStream.Write(Media.Codecs.Video.H264.NalUnitType.StartCode, 0, 3);

                            //Write the PPS
                            fileStream.Write(pps, 0, pps.Length);
                            memoryStream.Write(pps, 0, pps.Length);
                        }

                   }

                    //Write the data in the frame.
                    profileFrame.Buffer.CopyTo(fileStream);
                    profileFrame.Buffer.CopyTo(memoryStream);

                    Decode(memoryStream.ToArray(), isConfigFrame);
                    memoryStream.Close();
                    memoryStream = null;
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }               
        }
Is this way of using Decode (a few lines above the end) okay, and is it then correct to close the memorystream and set it to null again?

My Decode still looks like the one above; is making the SPS/PPS frame a CodecConfig frame correct?

And the SPS/PPS seems to get added often; is it enough and okay to only do this once?
if (!m_InitializedStream)// && (false == profileFrame.ContainsSequenceParameterSet || false == profileFrame.ContainsPictureParameterSet))

Thanks :-)


PS: Something is wrong, the profileFrame.Buffer seems to be empty.

ClientLogger.Instance.Log("Got a RTP Frame PacketCount = " + frame.Count + " Complete = " + frame.IsComplete);
tells me 04-06 10:16:23.200 I/mono-stdout(16104): Got a RTP Frame PacketCount = 1 Complete = False

maybe this
using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
is wrong...
Apr 6, 2016 at 9:33 AM
Edited Apr 6, 2016 at 10:22 AM
I added
System.Diagnostics.Debug.WriteLine("frame to decoder, length: "+ data.Length);
to the very beginning of my Decode method; the first time it gives me 26 (the SPS/PPS), after that it is sometimes 0 and varies in size from 1390 to 7805 or something like that.

I also removed all the fileStream stuff now.

If this helps:

in ProcessPacket I get here:

//Write the data of the fragment.
Buffer.Write(packetData, offset, fragment_size);

With values: packetData byte[1388] (could be right with my 1400-byte packets?), offset for example 0x0005 here, fragment_size 0x0567 here. Is this too much?


And in Depacketize I sometimes have 1 packet this way, up to 12.

Is it still correct to use Depacketize on the frame, or should I fill the buffer in a different way?
Apr 6, 2016 at 2:25 PM
And as an alternative I added this when I create my decoder:
            byte[] sps = null, pps = null;
            sps = StringToByteArray("42401ea680a03d90"); // according to Mr. Weber: StringToByteArray("42401fa680b41264");
            pps = StringToByteArray("ce30a480");
            _format.SetByteBuffer("csd-0", ByteBuffer.Wrap(sps));
            _format.SetByteBuffer("csd-1", ByteBuffer.Wrap(pps));
            m_InitializedStream = true; // todo: only if SPS/PPS are already set here
And so SPS and PPS are not added to the stream itself anymore, but it still does not help.
Apr 6, 2016 at 3:26 PM

It works!

Thanks a lot, again and again, for helping me with this one. The picture is still delayed and has glitches, but it works.

In the next couple of days I will ask my company to try and use your SIP instead of the current one, which is still in evaluation.
And if not, I will send you a few bucks so you can drink a beer to that success.

Thanks Julius, that was great support!
Coordinator
Apr 7, 2016 at 7:23 PM
Not a problem; don't forget that FrameChanged may fire multiple times. You will want to keep an eye on 'final' to see whether that is the last time or not.

After that you may also want to check for 'IsComplete' on the frame to ensure the markers are present.

If you access packets, keep in mind that Payload includes the extensions, Csrc's and padding; you probably want 'PayloadData' instead, otherwise use the 'Array' of the 'Payload' and remember to calculate the offsets correctly.
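For example, a minimal sketch of that offset arithmetic (the consume delegate is a stand-in for whatever the decoder wants; names follow this thread):
        static void FeedPayload(Media.Rtp.RtpPacket packet, System.Action<byte[], int, int> consume)
        {
            //'Payload' still contains any extensions, Csrc entries and padding;
            //skip the header octets and exclude the padding octets as described above.
            byte[] array = packet.Payload.Array;
            int offset = packet.Payload.Offset + packet.HeaderOctets;
            int count = packet.Payload.Count - (packet.HeaderOctets + packet.PaddingOctets);
            if (count > 0) consume(array, offset, count);
        }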

I am working on a few updates to the RtpFrame class so that it can offer more potential; I am also considering adding a JitterBuffer.

Thanks for your comments and let me know if you need anything else!
Marked as answer by juliusfriedman on 4/7/2016 at 11:23 AM
Apr 8, 2016 at 11:17 AM
Edited Apr 8, 2016 at 1:29 PM
Hello Julius,

I simplified my DecodePacket a lot now. I am not working with my own memory stream anymore; I am using profileFrame.Buffer directly.
When should I use Depacketize and when Assemble? Depacketize for the H264 codec and Assemble for non-encoded audio?
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    //Remove any profile header and create a stream which can be given to a decoder.
                    profileFrame.Assemble();//.Depacketize();

                    //If there is no buffer then there is nothing to process.
                    if (profileFrame.Buffer == null || profileFrame.Buffer.Length == 0) return;

                    //Write the data in the frame.
                    Decode(profileFrame.Buffer.ToArray(), profileFrame.IsComplete);
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
In the MediaCodec class of Android you can set flags for each packet I add:

MediaCodecBufferFlags.None
CodecConfig: buffer contains codec initialization / configuration instead of media data
EndOfStream
KeyFrame
SyncFrame: buffer contains data for a sync frame

So is your IsComplete flag the KeyFrame (one complete picture, if I am right)?

Your 'final' flag (from the FrameChanged event) seems not to be EndOfStream like I thought; it is sometimes true, sometimes false. So does it indicate a key frame? What is IsComplete then?
It is also true, but not that often. Maybe it would make sense to have it in the RtpFrame class too, if it indicates a keyframe?

Thanks :-)
Coordinator
Apr 8, 2016 at 2:01 PM
IsComplete is pretty much the same in the RFC6184 class as it is in the RtpFrame; I am working on possibly changing it to include some additional logic for derived types, but in general it means that the frame is not missing packets and has a marker packet.

Assemble is used on the RtpFrame type directly for obtaining a set of bytes which can be added to a memory stream; I am improving this to expose the segments directly so that copying doesn't require ToArray externally.

You can use it for types which are implicitly defined to have a depacketization process that doesn't involve fragmentation or further re-ordering, e.g. PCM.
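For example (a sketch; 'frame' is assumed to be a complete RtpFrame carrying PCM, with System.Linq in scope for ToArray):
        //Assemble yields the payload bytes in sequence order; for PCM the result
        //can be handed to the audio sink as-is.
        byte[] pcm = frame.Assemble().ToArray();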

'final', I agree, is possibly better placed on the frame, but I have to design this in; originally I planned to use ShouldDispose. Nonetheless, what it means is that the frame in this event is going to be 'Disposed' right afterwards; thus you can skip processing frames until 'final' is true, at the cost of a small amount of latency.

Then you can further allow or deny frames based on if you want to allow frames with missing packets or without a marker based on the profile etc.

I am also revising the RtpFrame class for other improvements as well such as the ability to customize depacketization via the 'RtpProfile' / 'RtpProfileInformation' but only the RtpFrame is close to being done at this point.

There are also possibly plans for a JitterBuffer.

Finally, 'KeyFrame' can be determined by using the 'ContainsInstantaneousDecoderRefresh' property on the RFC6184Frame.
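Putting those pieces together, a minimal sketch of the gating described above (names are the ones used in this thread; the decode step is the application's own):
        internal void OnSourceFrameChanged(object sender, Media.Rtp.RtpFrame frame = null, Media.Rtp.RtpClient.TransportContext tc = null, bool final = false)
        {
            //The frame is disposed right after the 'final' event, so process it then.
            if (frame == null || false == final) return;

            //IsComplete = no missing packets and a marker packet is present.
            if (false == frame.IsComplete) return;

            using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
            {
                profileFrame.Depacketize();

                if (profileFrame.Buffer == null) return;

                bool keyFrame = profileFrame.ContainsInstantaneousDecoderRefresh;

                //Hand profileFrame.Buffer to the decoder here.
            }
        }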

If you need anything else let me know!
Marked as answer by juliusfriedman on 4/8/2016 at 6:01 AM
Apr 11, 2016 at 9:21 AM
Hello Julius,

thanks for the hints and the update. Again I copied your RFC6184Media class into my project because I cannot use the System.Drawing class in RtspServer.
I had to remove the base class, some packetize stuff and, after that, the Start, but I think I do not need those.

So is it correct to use Assemble this way? I simplified much more now, and this method is called from your RtpClient OnSourceFrameChanged:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    //Create the managed packet from that binary data and add it to the frame.
                    //profileFrame.Add(frame);

                    //Remove any profile header and create a stream which can be given to a decoder.
                    //profileFrame.Assemble();//
                    //profileFrame.Depacketize();

                    //If there is no buffer then there is nothing to process.
                    //if (profileFrame.Buffer == null || profileFrame.Buffer.Length == 0) return;

                   Decode(profileFrame.Assemble().ToArray(), profileFrame.ContainsInstantaneousDecoderRefresh);
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
It still has a delay and heavy artifacts / glitches, especially when something moves.
I am using ContainsInstantaneousDecoderRefresh for setting the keyframe flag, but it does not change anything.
I tried both calling my method every time OnSourceFrameChanged fires with video, and calling it only when final is true there; I also see no big change.

My Decode method looks like this, with a _codec MediaCodec on Android started as "video/avc" for H264, with the correct width and height and SPS and PPS (I believe):
        public void Decode(byte[] data, bool isComplete)
        {
            try
            {
                MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                int inputBufferIndex = _codec.DequeueInputBuffer(-1); //could also be a timeout; a timeout should not be used for both input and output
                if (Globals.DoDetailedLog)
                    System.Diagnostics.Debug.WriteLine("inputBufferIndex: " + inputBufferIndex);
                if (inputBufferIndex >= 0)
                {
                    MediaCodecBufferFlags flags = MediaCodecBufferFlags.None;
                    if (isComplete)
                    {
                        flags = MediaCodecBufferFlags.KeyFrame;
                    }
                    else {
                        flags = MediaCodecBufferFlags.None; //todo Config Flag too?
                    }

                    //ByteBuffer buffer = mDecodeInputBuffers[inputBufferIndex];
                    //buffer.Clear();
                    //buffer.Put(data);
                    //_codec.QueueInputBuffer(inputBufferIndex,0, data.Length, 0,  flags);
                    //if (isComplete)
                    //    _codec.Flush();
                    _codec.QueueInputBuffer(inputBufferIndex, 0, data.Length, 0, flags);
                }

                int outputBufferIndex = _codec.DequeueOutputBuffer(bufferInfo, timeout);
               do
                {

                    if (outputBufferIndex == (int)MediaCodecInfoState.TryAgainLater) //-1
                    {
                        //no output available yet
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("no output from decoder available, waiting for correct frame");
                    }
                    else if (outputBufferIndex == (int)MediaCodecInfoState.OutputBuffersChanged) // -3
                    {
                        //_codec.GetOutputBuffers();
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("decoder output buffers changed");
                    }
                    else if (outputBufferIndex == (int)MediaCodecInfoState.OutputFormatChanged) // -2
                    {
                        MediaFormat formatNew = _codec.OutputFormat;
                        _codec.Configure(formatNew, _surface, null, 0);
                        //mediaformat changed
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("decoder mediaformat changed");
                    }
                    else if (outputBufferIndex < 0)
                    {
                        //unexpected result from encoder.dequeueOutputBuffer
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("unexpected result from deocder.dequeueOutputBuffer: " + outputBufferIndex);
                    }
                    else
                    {
                        bool doRender = bufferInfo.Size > 0; //(bufferInfo.Size != 0)
                        _codec.ReleaseOutputBuffer(outputBufferIndex, doRender);
                        outputBufferIndex = _codec.DequeueOutputBuffer(bufferInfo, 0);

                   }
                } while (outputBufferIndex > 0);
            }
            catch (Exception ex)
            {
                ClientLogger.Instance.Log(string.Format("Exception occured in H264Decoder Decode: {0}", ex.ToString()));
            }
        }
I'll try to find out what causes my huge artifacts...

Thanks for your help!
Coordinator
Apr 11, 2016 at 3:21 PM
Edited Apr 11, 2016 at 3:21 PM
It looks fine.

Try only calling it when 'final' is true for now; that will add a little latency but ensure the most data is available.

I will be working more on the RtpFrame class and derived implementations this week.
Marked as answer by juliusfriedman on 4/11/2016 at 7:21 AM
Coordinator
Apr 11, 2016 at 8:13 PM
Sorry, also there is an IsSlice method and an IsIntra method in the H.264 class; they may help you with detecting key frames.

111976 adds some additional logic for this and supports parallel processing of packets, so you can call Depacketize much more often without adverse effects.

Please definitely give it a look over and let me know what you think!
Marked as answer by juliusfriedman on 4/11/2016 at 12:13 PM
Apr 11, 2016 at 9:16 PM
Hi Julius,

it is 22:15 here, but I want to have a look tomorrow morning. Thanks a lot for your new version :-)

Should I use OnSourceFrameChanged or packet received for video? And only call my decoder when final is true?
And should I then use Depacketize for the packets? On the frame for each packet, or on the single packets received?

Thanks, I am anxious to see if it works out better :-)
Coordinator
Apr 11, 2016 at 9:20 PM
It really depends on what you're trying to do...

I would say that overall everything should be frame based and packet events should only be used as a last resort.

You can call your decoder anytime you want, but in general it should be called when there are no gaps in the received packets and the marker is present.

No problem and let me know how it works out.
Marked as answer by juliusfriedman on 4/11/2016 at 1:20 PM
Apr 11, 2016 at 10:32 PM
Can I use Depacketize with the Frame from OnSourceFrameChanged?
Coordinator
Apr 11, 2016 at 11:06 PM
Edited Apr 11, 2016 at 11:06 PM
Yes, where I am going you should be able to call Depacketize multiple times and determine any change via the Depacketized.Count.

There is currently HasDepacketized which can tell you after calling Depacketize if there is anything worth reading the 'Buffer' for.

The idea is to call Depacketize, then check HasDepacketized, and access the Buffer property if it was true.
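In code, the pattern is roughly (a sketch; 'frame' can be an RtpFrame or a derived type such as RFC6184Frame):
        //Depacketize may be called repeatedly as packets arrive.
        frame.Depacketize();

        //Only touch 'Buffer' when something was actually depacketized.
        if (frame.HasDepacketized)
        {
            System.IO.Stream buffer = frame.Buffer;
            //Read the assembled data from 'buffer' here.
        }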

The only hardship is if you access the Buffer in between adding packets and the packets arrive out of order; then the buffer cannot reliably be augmented with only the new data, so it has to be destroyed and re-created with the data in order, which makes the 'Buffer' property itself hard to maintain...

If you don't access Buffer you won't have that issue, and you really should only create the Buffer once 'IsComplete' is true for your frame.

If you want to take an attempt to decode partial frames or otherwise you can still access the data, even if it arrives out of order using the 'Depacketized' list on the RtpFrame.

This will prevent the MemoryStream from being created until it needs to be, or until you want it; it also saves having to dispose and re-create a memory stream just to access the data from one new packet.

Keep in mind all the data you need should reside in the 'Depacketized' list of a RtpFrame and can be copied to a decoder's buffer or simply given to a decoder in place using that list.

I am doing my best to consolidate this logic and add additional interfaces which make this all done with no virtual call overhead, any feedback you and anyone else can give me will on the API will help me design it better.

I have a lot of notes which attempt to explain everything, if something doesn't make sense ask me as that will also indicate my design is not easy to understand.

Thanks again for your testing!

P.s. more updates tomorrow!
Marked as answer by juliusfriedman on 4/11/2016 at 3:06 PM
Apr 12, 2016 at 8:54 AM
Edited Apr 12, 2016 at 9:19 AM
Hi Julius,

because of that System.Drawing problem I again copied your RFC6184Media into my project; instead of deriving from its base media class I changed it to

RFC6184Media: RtpFrame //Todo use RtpSink not RFC2435Media

to have the basic features I need. Then I made Depacketized public:

public readonly SortedList<int, Common.MemorySegment> Depacketized;

and deleted the Start, the Packetize and the EncodeMacroblock from RFC6184Media so I am able to build. I think I don't need those for decoding (?).

Your RtpClient OnSourceFrameChanged calls my DecodeFrame:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized && profileFrame.HasBuffer)
                    {
                        Decode(profileFrame.Buffer.ToArray(), profileFrame.IsComplete);
                    }    
That creates the following error:

04-12 09:47:39.815 I/ExtendedCodec( 8185): Decoder will be in frame by frame mode
04-12 09:47:39.837 D/SurfaceUtils( 8185): set up nativeWindow 0xba7989e8 for 640x480, color 0x7fa30c04, rotation 0, usage 0x42002900
Exception in DecodePacket: System.NullReferenceException: Object reference not set to an instance of an object
04-12 09:47:39.933 I/mono-stdout( 8185): Exception in DecodePacket: System.NullReferenceException: Object reference not set to an instance of an object
at sks_Client.Droid.RFC6184Media+RFC6184Frame.ProcessPacket (Media.Rtp.RtpPacket packet, Boolean ignoreForbiddenZeroBit, Boolean fullStartCodes) [0x00026] in C:\Projekte\sks-Client for Gateway 2.0\sks-Client for Gateway 2.0\sks_Client_for_Gateway_2._0.Droid\RFC6184Media.cs:397
at sks_Client.Droid.RFC6184Media+RFC6184Frame.Depacketize (Media.Rtp.RtpPacket packet) [0x00001] in C:\Projekte\sks-Client for Gateway 2.0\sks-Client for Gateway 2.0\sks_Client_for_Gateway_2._0.Droid\RFC6184Media.cs:372
at Media.Rtp.RtpFrame.Depacketize (Boolean allowIncomplete) [0x00022] in c:\Projekte\net7mma-111977\Rtp\RtpFrame.cs:950
at Media.Rtp.RtpFrame.Depacketize () [0x00000] in c:\Projekte\net7mma-111977\Rtp\RtpFrame.cs:937

Maybe some initialization is missing this way.
                //From the beginning of the data in the actual payload
                int payloadOffset = packet.Payload.Offset, 
                    nonPayloadOctets = packet.HeaderOctets,
                    offset = payloadOffset + nonPayloadOctets,
                    count = packet.Payload.Count - (nonPayloadOctets + packet.PaddingOctets); //until the end of the actual payload
Seems to crash. packet.Payload seems to be null sometimes (not always).
Apr 12, 2016 at 9:24 AM
Edited Apr 12, 2016 at 9:37 AM
I inserted this in RFC6184Media:
                if (packet.Payload == null)
                    return;

                //From the beginning of the data in the actual payload
and then I needed to change my code to remove the HasBuffer check, which seems to always be false.
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized)
                    {
                        Decode(profileFrame.Buffer.ToArray(), profileFrame.IsComplete);
                    }     
I also set profileFrame.IsKeyFrame() instead of profileFrame.IsComplete here, but see no difference for now.

Is it correct to create a new profileFrame from the given frame each time, or do I need to add the current frame somehow before using Depacketize?

At least I get a visible video again and will now try to improve performance. Thanks!
Apr 12, 2016 at 10:17 AM
Edited Apr 12, 2016 at 2:25 PM
I am unsure how to use

profileFrame.IsKeyFrame(), profileFrame.IsComplete, profileFrame.ContainsInstantaneousDecoderRefresh

I currently use

profileFrame.IsKeyFrame() to tell the codec it is a KeyFrame (or SyncFrame; according to the Xamarin docs both are the same),

profileFrame.IsComplete to tell the codec to flush (?) is that EndOfStream?

and does
profileFrame.ContainsInstantaneousDecoderRefresh with the SPS, PPS, Slice tell the codec it is a CodecConfig flag?

Receive the frame:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized)
                    {
                        Decode(profileFrame.Buffer.ToArray(), profileFrame.IsKeyFrame(), profileFrame.IsComplete, profileFrame.ContainsPictureParameterSet || profileFrame.ContainsSequenceParameterSet);                        
                    }                    
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
Decode the frame with the Android MediaCodec:
        public void Decode(byte[] data, bool isKeyframe, bool isComplete, bool isCodecConfig)
        {
            try
            {
                MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                int inputBufferIndex = _codec.DequeueInputBuffer(-1); //could also be a timeout; a timeout should not be used for both input and output
                if (Globals.DoDetailedLog)
                    System.Diagnostics.Debug.WriteLine("inputBufferIndex: " + inputBufferIndex);
                if (inputBufferIndex >= 0)
                {
                    MediaCodecBufferFlags flags = MediaCodecBufferFlags.None;
                    if (isComplete)
                    {
                        flags = MediaCodecBufferFlags.EndOfStream; 
                    }
                    else if (isKeyframe)
                    {
                        flags = MediaCodecBufferFlags.KeyFrame; //Xamarin's definition for KeyFrame and SyncFrame is the same: This indicates that the (encoded) buffer marked as such contains the data for a key frame
                    }
                    else if (isCodecConfig)
                    {
                        flags = MediaCodecBufferFlags.CodecConfig; 
                    }
                    else
                    {
                        flags = MediaCodecBufferFlags.None; 
                    }

                    _decodeInputBuffer = _codec.GetInputBuffer(inputBufferIndex);
                    _decodeInputBuffer.Clear();
                    _decodeInputBuffer.Put(data);
                    _codec.QueueInputBuffer(inputBufferIndex, 0, data.Length, 0, flags);

                    // Call Flush at the end -> EndOfStream
                    if (isComplete)
                        _codec.Flush();                    
                }

                int outputBufferIndex = _codec.DequeueOutputBuffer(bufferInfo, timeout);
                if (Globals.DoDetailedLog)
                    System.Diagnostics.Debug.WriteLine("outputBufferIndex: " + outputBufferIndex);
                do
                {

                    if (outputBufferIndex == (int)MediaCodecInfoState.TryAgainLater) //-1
                    {
                        //no output available yet
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("no output from decoder available, waiting for correct frame");
                    }
                    else if (outputBufferIndex == (int)MediaCodecInfoState.OutputBuffersChanged) // -3
                    {
                        //_codec.GetOutputBuffers();
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("decoder output buffers changed");
                    }
                    else if (outputBufferIndex == (int)MediaCodecInfoState.OutputFormatChanged) // -2
                    {
                        MediaFormat formatNew = _codec.OutputFormat;
                        _codec.Configure(formatNew, _surface, null, 0);
                        //mediaformat changed
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("decoder mediaformat changed");
                    }
                    else if (outputBufferIndex < 0)
                    {
                        //unexpected result from encoder.dequeueOutputBuffer
                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("unexpected result from deocder.dequeueOutputBuffer: " + outputBufferIndex);
                    }
                    else
                    {
                        bool doRender = bufferInfo.Size > 0;
                        _codec.ReleaseOutputBuffer(outputBufferIndex, doRender);
                        outputBufferIndex = _codec.DequeueOutputBuffer(bufferInfo, timeout);

                        if (Globals.DoDetailedLog)
                            System.Diagnostics.Debug.WriteLine("inner outputBufferIndex: " + outputBufferIndex);
                    }
                } while (outputBufferIndex > 0);
            }
            catch (Exception ex)
            {
                ClientLogger.Instance.Log(string.Format("Exception occured in H264Decoder Decode: {0}", ex.ToString()));
            }
        }
This works, has a delay and heavy artifacts / glitches, which remain for a while.

Xamarin says this about the flags:

EndOfStream
This signals the end of stream, i.e. no buffers will be available after this, unless of course, MediaCodec.Flush follows

Both KeyFrame & SyncFrame
This indicates that the (encoded) buffer marked as such contains the data for a key frame
Coordinator
Apr 12, 2016 at 5:06 PM
Thank you for your questions and the examples you posted.

You do not need RFC6184Media for decoding; you only need the frame class. In the future the frame classes will be moved to either the Rtp assembly or another one which can encapsulate their functionality and will be used by the RtspServer when required.

Because Decode takes byte[] data, you can probably provide it just a single NAL at a time, unless I am missing something... If it needs to receive a single access unit then you will have to wait for an AccessUnitDelimiter to be received, or for the 'Marker' bit, before passing the data to decode.

Anyway more importantly....

In Decode you can queue an input buffer; the data for these buffers is available from the RtpFrame directly.

After you call Depacketize there is a list of the arrays of data (sorted by decoding order number if needed).

These arrays are encapsulated by MemorySegment, which provides the array, source offset and length of each piece of memory needed by the decoder, typically created during depacketization.

You can check this with HasDepacketized.

When you want to use the data without creating a MemoryStream or copying it, you can simply pass these buffers to the decoder; this is the most optimal approach as it doesn't require any copying of data.
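A hedged sketch of what that could look like with the Android MediaCodec used earlier in this thread (it assumes the frame's Depacketized list is accessible, as the copied RFC6184Media class above made it public):
        int inputBufferIndex = _codec.DequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            Java.Nio.ByteBuffer input = _codec.GetInputBuffer(inputBufferIndex);
            input.Clear();
            int total = 0;

            //Each MemorySegment exposes the array, source offset and count, so no
            //copy into an intermediate MemoryStream is required.
            foreach (Media.Common.MemorySegment segment in profileFrame.Depacketized.Values)
            {
                input.Put(segment.Array, segment.Offset, segment.Count);
                total += segment.Count;
            }

            _codec.QueueInputBuffer(inputBufferIndex, 0, total, 0, MediaCodecBufferFlags.None);
        }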

In the future I will probably have some type of Stream encapsulation around MemorySegment which allows it to be used as a Stream for reading purposes; thus the Buffer property may give you a stream with CanWrite = false and CanRead = true, already representing the order of data within Depacketized.

Thus when you call Depacketize, only the new data will be added to the 'Buffer', without having to re-create it for out-of-order packets.

This will also eliminate any memory copying as the data can be read in place directly from the MemorySegment even when being read through the Stream facade.

In short, if there is packet loss then it needs to be addressed, as the decoder will probably not be able to decode the result, or it will have artifacts.

If there is no loss then all frames should have IsMissingPackets = false and HasMarker = true.

In those cases the decoder needs the data even if the frame doesn't contain a KeyFrame...

For IsKeyFrame I have implemented the logic necessary to detect an IDR NAL; this is the correct case about 95% of the time. However, depending on how encoding is performed, when the full IDR would otherwise come very often, an alternate type of frame can sometimes be sent when the full IDR is not needed because there is little motion, and this type of frame can still be decoded on its own.

That has to be determined by parsing the slice header to find the type of slice contained, which I don't have implemented yet because that's not really the responsibility of the Rtp layer.

Since that logic (slice header parsing) is also useful for creating a compliant bit stream with the correct full or short start codes, I will probably end up incorporating it somewhere in the library, both to round out the logic of key frame detection and to properly form the bit stream given to the decoder.


I don't think you really need to set the KeyFrame or SyncFrame flags unless you are encoding; when decoding, those flags will be set for you by the decoder when it reads the buffer. There is no reason to force this otherwise.

Please let me know if that makes sense and I will hopefully get some updates in later today!
Marked as answer by juliusfriedman on 4/12/2016 at 9:06 AM
Coordinator
Apr 12, 2016 at 7:58 PM
Edited Apr 12, 2016 at 7:59 PM
Posted some updates...

Still not done yet, and I am not sure if I will need to put as much work as I thought into IRtpFrame, or even any at all... although for the best performance I probably should, and then modify the RtpClient to use the IRtpFrame interface to ensure that.

I exposed JitterBuffer, but to use it with the derived frame types effectively, e.g. for sharing data or otherwise, you would have to derive it as JitterBufferH264 or JPEGJitterBuffer. (For now anyway.)

I am going to look more into Depacketization and the API on RtpFrame and hopefully at least get that and some of the other issues sorted out so I can finally close a few of them.
Marked as answer by juliusfriedman on 4/12/2016 at 11:59 AM
Apr 13, 2016 at 7:45 AM
Edited Apr 13, 2016 at 8:36 AM
Hi Julius,

I now changed my DecodeFrame, which is called by your RtpClient OnFrameChanged, to:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                frame.Depacketize();
                if (frame.HasDepacketized)
                {
                    Decode(frame.Buffer.ToArray(), false, false, false);
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
It still works this way and seems marginally faster / fewer artefacts, but not much.
How do I get the keyframe now? It works without setting any flags (like KeyFrame), but I still need to find out what causes the artefacts and the delay. Is it a keyframe when frame.HasMarker?

I am also trying to find a better way of using the memorystream directly...

Still examining ... :-)

PS: Again it seemed I had deployed but was running the old version. Strange. The above code is not working...
With your new RtpClient it is marginally faster. Still trying to get it working with the frame directly instead of the RFC6184Frame.
Apr 13, 2016 at 8:48 AM
Edited Apr 13, 2016 at 9:22 AM
When you want to use the data without creating a MemoryStream or copying it, you can simply pass these buffers to the decoder; this is the most optimal approach as it doesn't require any copying of data.
I cannot find anything on the frame other than the Buffer. How do I get these? There is no Data / MemorySegment or anything. In your source I only find the non-public Depacketized as a sorted list of MemorySegment...

Or, in SourceFrameChanged, can I use tc.ContextMemory from the TransportContext directly to feed my decoder?
Apr 13, 2016 at 9:40 AM
Edited Apr 13, 2016 at 12:56 PM
One thing I found out:
I added this to OnSourceFrameChanged:
                    if (!frame.HasMarker || frame.IsMissingPackets)
                    {
                        ClientLogger.Instance.Log("Attention! Data loss in RTPClient OnSourceFrameChanged");
                    }
                }
IsMissingPackets occurs 0-4 times each second; HasMarker appears around 20-40 times each second.
If HasMarker is only true for the last packet, that is okay I guess?

Is that rate of IsMissingPackets okay?

My rendering is called around 8 times a second (between 6 and 12, most times 8).

I get OnSourceFrameChanged with 'final' around 25-30 times each second.

I get a frame there that is 'Complete' around 15 times each second.

And IsKeyFrame only appears once at the start and sometimes many seconds later.
I cannot tell if this is correct and the gateway really sends that few keyframes, or if it is a bug?

I will try to find out, or configure the gateway to send more keyframes. I was promised a way to configure the stream myself today.

I was using the old code with OnSourceFrameChanged in debug mode:
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized)
                    {
                        if (profileFrame.IsKeyFrame())
                            ClientLogger.Instance.Log(string.Format("KeyFrame at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));

                        Decode(profileFrame.Buffer.ToArray(), profileFrame.IsKeyFrame(), profileFrame.IsComplete, profileFrame.ContainsPictureParameterSet || profileFrame.ContainsSequenceParameterSet || profileFrame.ContainsSupplementalEncoderInformation); // || profileFrame.ContainsCodedSlice
                    }
                }
Coordinator
Apr 13, 2016 at 1:36 PM
It's probably not a bug, it's just configuration; an entire frame really only needs to be sent during the initial connection and after that can come much later.

E.g. an I-frame may arrive only once every 5 minutes or more in a stream where the bandwidth is very low and there is little movement, and the SPS and PPS may never be sent again outside the SDP.

This is configured on the camera or gateway as a GOP size / duration or bandwidth setting.

You may want to try increasing the receive buffer size property on the socket; it may help with loss.

Please also see the Contains method which can search for multiple contained NAL units in one pass; here you make 4 passes. I will probably move those properties to extension methods soon.

I will also expose the buffers from the frame, they are in the Depacketized list.

I will also hopefully work more on separating RFC6184Frame from the RtspServer.

Thanks again!
Marked as answer by juliusfriedman on 4/13/2016 at 5:36 AM
Apr 13, 2016 at 2:16 PM
Edited Apr 13, 2016 at 3:32 PM
Hi Julius,

I haven't set the receive buffer size yet; how can I?


In your RtpClient code? Which value can I try?
        //Don't buffer receive.
        socket.ReceiveBufferSize = 0;
I don't mind whether I use RtpFrame directly or RFC6184Frame; if I should try RtpFrame, please tell me how.

I now have an interface where I can read and set some parameters:

BitRate: 1000 (kbps)
Width 640
Height 480
GopSize 10 (changes seem to alter the amount of artefacts a bit)
IDRInterval 25
QuantParam: 0 (seems to be quality from 0 to 51 if BitRate is 0. I tried setting BitRate to 0 with various values here but did not get a picture anymore, or only a frozen white one)
intraRefresh 0
sliceSize 0
configInterval 1
mtu 1400 (I tried to increase it but didn't see a change)

I will now test whether it gets better by changing those; if you have ideas about what I should try, please let me know. Still a 1-2 second delay and many artefacts.

PS: I am not sure how to set the receive buffer size; I changed this in your RtpClient code:
                //Don't buffer receiving
                socket.ReceiveBufferSize = 1000000;
I rebuilt, copied the DLL and restarted, but I see no big change...
Please also see the Contains method which can search for multiple contained NAL units in one pass; here you make 4 passes. I will probably move those properties to extension methods soon.
I am not sure where I should change something?
I still have a funny feeling about the missing keyframes. Aren't they the cause of the heavy artefacts?

Thanks for your help :-)
Coordinator
Apr 13, 2016 at 3:38 PM
Edited Apr 13, 2016 at 4:01 PM
socket.ReceiveBufferSize can be changed during runtime...

There is an extension method which can do this for you when you need it:
foreach (var tc in sender.Client.GetTransportContexts())
            {
                Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(tc, bufferSize);
            }
Obviously, if this context shares a socket then you are affecting all sockets which use this receive buffer value; under TCP usually all contexts share a socket, so this call would be better in that case. The RtspClient already does this for you based on the given buffer size anyway...
Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(rtspClient, bufferSize);
IDRInterval is probably what you're looking for; it coincides with your observations about 'final', and decreasing that value can provide more frequent IDR frames.

The MTU (and receive buffer) should really only be changed for local LAN traffic; using them over the internet would likely not yield different results from the defaults.

As far as using RtpFrame vs RFC6184Frame, that's an application requirement for the most part.

The application needs to know how the decoder expects its data: either in RTP / NAL form, bitstream form, or some other form.

All decoders have to accept the Bitstream form so that is why the classes help you get to that form.

You can decode the NAL units and get a picture, but you're experiencing packet loss, which obviously causes artifacts; you shouldn't be experiencing packet loss unless you're exceeding your bandwidth or taking too long to read data from the socket.

The way packet loss is usually handled is by concealing the errors of the loss; the other way is to use an FEC extension to attempt to restore the lost data.

If your decoder can handle fragmented NAL units then you can just give it each packet's Payload.Array; the offset into that array is given by 'packet.Payload.Offset + packet.HeaderOctets' and the length of the payload data is given by 'packet.Payload.Count - (packet.HeaderOctets + packet.PaddingOctets)'.

Such a decoder will be able to determine the frame by reading the slice headers and frame number.

If you're looking for a way to display data reliably, why don't you wait until 'final' is true and then store the completed frame somewhere?

When you have more than X completed frames, or time T elapses, take N completed frames, where N is the number of frames older than T, and display them.

This is typically called a playout / decode buffer; typically the jitter buffer provides packets to a playout / decode buffer at certain times based on both the network jitter and the packet loss.

Most applications set a delay; at that delay they re-order packets into frames, and if frames are incomplete the ordering stops and only the completed frames are utilized.
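A hypothetical sketch of such a playout buffer (not part of this library; the class, names and the fixed delay are assumptions):
        class PlayoutBuffer
        {
            struct Entry { public System.DateTime Arrived; public byte[] Data; }

            readonly System.TimeSpan m_Delay;

            readonly System.Collections.Generic.Queue<Entry> m_Frames = new System.Collections.Generic.Queue<Entry>();

            public PlayoutBuffer(System.TimeSpan delay) { m_Delay = delay; }

            //Store a completed frame, e.g. when 'final' fires and IsComplete is true.
            public void Add(byte[] completedFrame)
            {
                m_Frames.Enqueue(new Entry { Arrived = System.DateTime.UtcNow, Data = completedFrame });
            }

            //Release frames older than the configured delay, oldest first.
            public System.Collections.Generic.IEnumerable<byte[]> TakeDue()
            {
                while (m_Frames.Count > 0 && System.DateTime.UtcNow - m_Frames.Peek().Arrived >= m_Delay)
                    yield return m_Frames.Dequeue().Data;
            }
        }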

This library just provides the packets and frames as they come; it helps where it can to ensure re-ordering is done at the packet level, and additionally provides frame events when desired so that individual packet events don't need to be monitored.

In some cases you may only want to have 1 packet at a time in memory; you wouldn't have frame events in that case, and it would be up to you to do something with the packets each time they arrived.

With frames it's a little easier to manage, especially since most people want to decode, and this provides a facility for going from packets to frames to a stream of data which can be used outside of RTP.

If you want to do something like forward error correction then you potentially don't really need frame events; you could just use the individual packet events and store the FEC data you need.

I went on a bit of a tangent there, hopefully not too much.

let me know if that answers your questions.

Also please note that I think there is a recently introduced bug where I am causing artifacts when the sequence number wraps; I am going to look into this and fix it soon.
Marked as answer by juliusfriedman on 4/13/2016 at 7:38 AM
Apr 14, 2016 at 6:35 AM
Edited Apr 14, 2016 at 7:05 AM
Hello Julius,

thanks for the info, I am still reading...

I am using an RtpClient and this does not work:
                RTPClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription, rtpPort: 0, rtcpPort: 0);
                //Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(RTPClient, Globals.BufferSize);
Can I do it this way?

public static int BufferSize = 12288; // RtpClient socket default per MSDN: 8192 (8k)
    internal void OnSourceFrameChanged(object sender, RtpFrame frame = null, RtpClient.TransportContext tc = null, bool final = false)
    {
        if (frame != null)
        {
            Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(tc, Globals.BufferSize);
I feel like the delay is better now, but there are still heavy artefacts.
Also please note that I think there is a recently introduced bug where I am causing artifacts when the sequence number wraps; I am going to look into this and fix it soon.
This sounds good. Yes, I understood most of your explanations, thanks. The problem is that the source is a doorbell with a camera, so the delay is not allowed to be too big. Right now it is around a second, which would be okay, but with too-heavy artefacts. I cannot let the delay get bigger...

Thanks again Julius!
Apr 14, 2016 at 12:19 PM
Just an info,

in order to see whether I need a jitter buffer, I temporarily added this log, called from OnSourceFrameChanged:

ClientLogger.Instance.Log(string.Format("Lowest {0} and highest {1} Sequence Number", frame.LowestSequenceNumber, frame.HighestSequenceNumber));


For example, this gives the following result:

Lowest 20648 and highest 20654 Sequence Number
Lowest 20648 and highest 20655 Sequence Number
Lowest 20648 and highest 20656 Sequence Number
Lowest 20648 and highest 20657 Sequence Number
Lowest 20658 and highest 20658 Sequence Number
Lowest 20658 and highest 20659 Sequence Number
Lowest 20658 and highest 20660 Sequence Number
Lowest 20648 and highest 20657 Sequence Number
Lowest 21143 and highest 21143 Sequence Number
Lowest 20658 and highest 20660 Sequence Number
Lowest 21147 and highest 21147 Sequence Number
Lowest 21147 and highest 21148 Sequence Number
Lowest 21143 and highest 21143 Sequence Number
Lowest 21151 and highest 21151 Sequence Number
Lowest 21151 and highest 21152 Sequence Number
Lowest 21147 and highest 21148 Sequence Number
Lowest 21155 and highest 21155 Sequence Number
Lowest 21155 and highest 21156 Sequence Number
Lowest 21151 and highest 21152 Sequence Number
Lowest 21160 and highest 21160 Sequence Number
Lowest 21160 and highest 21161 Sequence Number
Lowest 21155 and highest 21156 Sequence Number
Lowest 21166 and highest 21166 Sequence Number
Lowest 21160 and highest 21161 Sequence Number
Lowest 21171 and highest 21171 Sequence Number
Lowest 21166 and highest 21166 Sequence Number
Lowest 21176 and highest 21176 Sequence Number
Lowest 21176 and highest 21177 Sequence Number
Lowest 21171 and highest 21171 Sequence Number
Lowest 21182 and highest 21182 Sequence Number
Lowest 21182 and highest 21183 Sequence Number
Lowest 21176 and highest 21177 Sequence Number
Lowest 21197 and highest 21197 Sequence Number
Lowest 21197 and highest 21198 Sequence Number
Lowest 21182 and highest 21183 Sequence Number
Lowest 21203 and highest 21203 Sequence Number
Lowest 21203 and highest 21204 Sequence Number
Lowest 21197 and highest 21198 Sequence Number
Lowest 21208 and highest 21208 Sequence Number
Lowest 21203 and highest 21204 Sequence Number
Lowest 21213 and highest 21213 Sequence Number
Lowest 21208 and highest 21208 Sequence Number
Lowest 21219 and highest 21219 Sequence Number
Lowest 21219 and highest 21220 Sequence Number
Lowest 21213 and highest 21213 Sequence Number
Lowest 21224 and highest 21224 Sequence Number
Lowest 21224 and highest 21225 Sequence Number
Lowest 21219 and highest 21220 Sequence Number
Lowest 21229 and highest 21229 Sequence Number

In a few cases there seems to be a step back or something arriving not in order...
Coordinator
Apr 14, 2016 at 12:29 PM
Edited Apr 14, 2016 at 12:33 PM
Weird, there are seemingly duplicate and out-of-order packets.

Can you post a Wireshark capture so I can check a few things? I think there are multiple frames with different timestamps or payload types; that's the only way I can easily make sense of that.

I am going to add some other tests to ensure reordering is correct also.

I imagine this doorbell is on WiFi... and you're using UDP?

Have you tried TCP?

The reordering is not too bad, but the duplicates should have been rejected by the RFC 3550 algorithm.

I think, possibly, that if many of those duplicate exceptions occur, this will cause needless exceptions to be thrown and increase memory usage and delay.

I will see about improving Add for certain cases shortly; additionally, I will see about not throwing an exception for duplicate packets.
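For reference, out-of-order and duplicate detection comes down to 16-bit serial-number arithmetic on the sequence numbers; a minimal sketch (not the library's implementation):
        //True when 'b' comes after 'a', taking 16-bit wrap-around into account;
        //a packet that is neither later nor equal is out of order, and an equal
        //sequence number for the same timestamp is a duplicate.
        static bool IsLater(ushort a, ushort b)
        {
            return a != b && (ushort)(b - a) < 0x8000;
        }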
Marked as answer by juliusfriedman on 4/14/2016 at 4:29 AM
Apr 14, 2016 at 1:34 PM
Edited Apr 14, 2016 at 1:34 PM
Hello Julius,

here is a wireshark log I just created:
https://www.dropbox.com/s/6x9l0x6a58td657/Wireshark-AndroidClient-WithVideoStreamWasOkWithArtefacts.pcapng?dl=0

the video was visible, with a small delay and heavy artefacts.

Right now the SIP client can only register on the gateway using UDP, but I was told it should be switched to TCP later (Asterisk runs on the gateway, but I have no experience / knowledge of it).
The RTP client should use UDP, I was told.

I will now turn off OnSourceFrameChanged, turn on RtpPacketReceived, and have a look at the sequence numbers for that case.

Thanks!
Coordinator
Apr 14, 2016 at 1:42 PM
Cool, I will check this capture further and let you know what I find.

Keep me updated as well!
Apr 14, 2016 at 1:42 PM
Got a RTP packet, SequenceNo = 1 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18392 PayloadType = 99 Length = 17
Got a RTP packet, SequenceNo = 2 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18393 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 3 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18394 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 4 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18395 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 6 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18396 PayloadType = 99 Length = 591
Got a RTP packet, SequenceNo = 7 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18397 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 8 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18398 PayloadType = 99 Length = 352
Got a RTP packet, SequenceNo = 9 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18399 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 10 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18400 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 11 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18401 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 12 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18402 PayloadType = 99 Length = 369
Got a RTP packet, SequenceNo = 13 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18403 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 14 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18404 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 15 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18405 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 16 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18406 PayloadType = 99 Length = 94
Got a RTP packet, SequenceNo = 17 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18407 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 18 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18408 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 19 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18409 PayloadType = 99 Length = 50
Got a RTP packet, SequenceNo = 20 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18410 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 21 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18411 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 22 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18412 PayloadType = 99 Length = 1046
Got a RTP packet, SequenceNo = 23 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18413 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 24 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18414 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 25 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18415 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 26 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18416 PayloadType = 99 Length = 803
Got a RTP packet, SequenceNo = 27 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18417 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 28 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18418 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 29 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18419 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 30 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18420 PayloadType = 99 Length = 1400


later


Got a RTP packet, SequenceNo = 18628 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 131 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18629 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 132 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18631 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 134 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18632 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 135 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18635 PayloadType = 99 Length = 21
Got a RTP packet, SequenceNo = 136 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18636 PayloadType = 99 Length = 17
Got a RTP packet, SequenceNo = 137 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18637 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 138 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18647 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 139 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18648 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 140 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18652 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 141 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18653 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 142 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18655 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 143 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18656 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 144 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18657 PayloadType = 99 Length = 1216
Got a RTP packet, SequenceNo = 145 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18658 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 146 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18659 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 147 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18661 PayloadType = 99 Length = 14
Got a RTP packet, SequenceNo = 148 PayloadType = 0 Length = 172
Got a RTP packet, SequenceNo = 18662 PayloadType = 99 Length = 1400
Got a RTP packet, SequenceNo = 149 PayloadType = 0 Length = 172


Later I see a few are missing here, but the order seems to be correct...
Coordinator
Apr 14, 2016 at 3:18 PM
The order of the frames looks correct too; I can't yet find an instance where I have incorrectly ordered packets.

One thing you're not showing is timestamps and markers; some packets might look dropped but aren't, or some of those packets might bear different timestamps.

I have to look further at your capture to really tell.

Can you show the code related to frame handling which is giving you trouble?

I imagine in the event handling that you are checking for IsComplete before attempting to Depacketize.

If Depacketize is called before the frame is complete and then again after, there is the possibility for the data to come out of order because of how I input the data into the sorted list.

You must either not call Depacketize until you're sure no more packets are coming for that frame, or I have to make a way to dispose the buffer and recreate it, or come up with a better method of creating the buffer.
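
A minimal sketch of the first option (waiting until the frame is complete), using the handler names from this thread; Decode is the application's own method:

    public void DecodeFrame(Media.Rtp.RtpFrame frame)
    {
        // Skip frames that are still missing packets; the frame will be
        // seen again, eventually with 'final' == true.
        if (!frame.IsComplete) return;

        frame.Depacketize();

        if (frame.HasDepacketized)
            Decode(frame.Buffer.ToArray(), false, false, false);
    }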

I think what is happening is that you're calling Depacketize multiple times like I instructed you to, but I haven't completed the logic necessary to make sure that it works in all cases as desired.

I think the key here will be making an out-of-order packet test which uses out-of-order interleaving in the format and ensuring that the data is how it's supposed to be.

E.g. in cases where the data is in RTP order I don't even need to worry about interleaved data except when packets are missing.

When the data needs additional depacketization ordering, e.g. interleaving where the data in packet 0 comes before the data in packet 65535 but only during depacketization, then I should be able to handle this by using that order in the Depacketized list as I do now, but the packets have to be in RTP order first.

You're not using that type of transfer here, nor are the majority of others, but to be complete it needs to be handled.

I will create a few more tests and see what I can come up with.

One way I can think of to solve the problem of the Buffer (Stream) is to create a Stream which is sourced from a List or SortedList of memory segments.

This will allow the Buffer to have memory inserted in any order necessary and still look like a stream.

I am going to work on that some more since the PacketKey solution isn't viable because it's too local to the individual packets.
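
A minimal sketch of that idea, assuming illustrative names (this is not the library's eventual API): a read-only Stream backed by a list of segments, so data can be inserted at any logical position without copying into one contiguous buffer.

    public class SegmentStream : System.IO.Stream
    {
        readonly System.Collections.Generic.List<System.ArraySegment<byte>> segments =
            new System.Collections.Generic.List<System.ArraySegment<byte>>();

        long position, length;

        // Insert a segment at a logical index (e.g. decoding order).
        public void Insert(int index, System.ArraySegment<byte> segment)
        {
            segments.Insert(index, segment);
            length += segment.Count;
        }

        public override int Read(byte[] buffer, int offset, int count)
        {
            int read = 0; long skip = position;
            foreach (var s in segments)
            {
                if (skip >= s.Count) { skip -= s.Count; continue; }
                int take = System.Math.Min(count - read, s.Count - (int)skip);
                System.Array.Copy(s.Array, s.Offset + (int)skip, buffer, offset + read, take);
                read += take; skip = 0;
                if (read == count) break;
            }
            position += read;
            return read;
        }

        public override bool CanRead { get { return true; } }
        public override bool CanSeek { get { return true; } }
        public override bool CanWrite { get { return false; } }
        public override long Length { get { return length; } }
        public override long Position { get { return position; } set { position = value; } }
        public override long Seek(long offset, System.IO.SeekOrigin origin)
        {
            if (origin == System.IO.SeekOrigin.Current) offset += position;
            else if (origin == System.IO.SeekOrigin.End) offset += length;
            return position = offset;
        }
        public override void Flush() { }
        public override void SetLength(long value) { throw new System.NotSupportedException(); }
        public override void Write(byte[] buffer, int offset, int count) { throw new System.NotSupportedException(); }
    }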
Marked as answer by juliusfriedman on 4/14/2016 at 7:19 AM
Apr 14, 2016 at 3:28 PM
Edited Apr 14, 2016 at 3:29 PM
For creating the Wireshark capture and the first log with the sequence numbers in the wrong order I used this code:

My new buffer size (makes the delay better, but still many artefacts):
public static int BufferSize = 12288; // for the RtpClient socket; the default is 8192 (8k) according to MSDN

Getting a Frame:
        internal void OnSourceFrameChanged(object sender, RtpFrame frame = null, RtpClient.TransportContext tc = null, bool final = false)
        {
            if (frame != null)
            {
                Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(tc, Globals.BufferSize);

                if (Globals.DoDetailedLog)
                    ClientLogger.Instance.Log(string.Format("Got a RTP Frame PacketCount {0}  IsComplete {1}  IsFinal {2}  Time: {3}:{4}", frame.Count, frame.IsComplete, final, DateTime.Now.ToString(), DateTime.Now.Millisecond));
                if (tc.MediaDescription.MediaType == MediaType.video) // If you also check && final here, the delay gets bigger but the picture slightly better
                {
                    if (VideoFrameReceived != null)
                        VideoFrameReceived(this, frame);
                    //if (!frame.HasMarker || frame.IsMissingPackets)
                    if (frame.IsMissingPackets)
                    {
                        ClientLogger.Instance.Log(string.Format("Attention! Data loss in RTPClient OnSourceFrameChanged at Time: {0}:{1} ", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    }
                }
                else
                {
                    if (Globals.DoDetailedLog)
                        ClientLogger.Instance.Log(string.Format("Got a RTP Frame from type {0}", tc.MediaDescription.MediaType.ToString()));
                }
            }
            else
            {
                ClientLogger.Instance.Log("NULL - RTP frame received");
            }
        }
Via events this is passed on to here:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                //frame.Depacketize();
                //if (frame.HasDepacketized) // && !frame.IsMissingPackets
                //{
                //    Decode(frame.Buffer.ToArray(), frame.HasMarker, frame.IsComplete, false);
                //}

                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    if (Globals.DoDetailedLog)
                        ClientLogger.Instance.Log(string.Format("Lowest {0} and highest {1} Sequence Number", frame.LowestSequenceNumber, frame.HighestSequenceNumber));
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized)
                    {
                        if (profileFrame.IsKeyFrame())
                            ClientLogger.Instance.Log(string.Format("KeyFrame at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));

                        Decode(profileFrame.Buffer.ToArray(), profileFrame.IsKeyFrame(), profileFrame.IsComplete, profileFrame.ContainsPictureParameterSet || profileFrame.ContainsSequenceParameterSet || profileFrame.ContainsSupplementalEncoderInformation); // || profileFrame.ContainsCodedSlice
                    }
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
Apr 14, 2016 at 3:32 PM
Edited Apr 14, 2016 at 3:37 PM
You must either not call Depacketize until you're sure no more packets are coming for that frame, or I have to make a way to dispose the buffer and recreate it, or come up with a better method of creating the buffer.
So only call Depacketize() if (profileFrame.IsComplete), or something like this?

Or do I maybe need a jitter buffer to re-arrange the order?
Coordinator
Apr 14, 2016 at 3:40 PM
Edited Apr 14, 2016 at 3:48 PM
Yes, that's correct I believe.

No, you don't need a jitter buffer, especially if you're not playing the data; you're giving data to a decoder. If that decoder is also controlled by you, then you should only be calling Decode and displaying data already given to the decoder.

Now if you are playing the data, and it's not copied and needs to be present at play time, then yes, a jitter buffer, or at least something to hold onto the data of completed frames until you need to decode, would be useful.

E.g. if packets are lost, the jitter buffer will consume memory and you still won't be able to play the data; what did it benefit you? Now you have to prune it...

If you wait for that, then you can call Depacketize safely and use HasDepacketized etc.

The only other thing you need to handle is when 'final' is true.

This means that frame is about to be disposed.

If you have already handled it then it doesn't matter; if not, then you have the option of looking at the types of NAL units it has and determining what you want to do with them.

Finally, don't check IsComplete twice as that will cause more delay. Also, don't use the individual Contains properties to check a series of NAL units; use the Contains overload to check many types in one iteration.

If you need to check for multiple things, then just loop over the contained NAL types, check there for the types you need, and set the values accordingly (see the sketch at the end of this post).

With those changes you will have very little delay and no artifacts beyond those caused by loss.
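
A sketch of that single-pass check; 'GetNalUnitTypes()' is a stand-in for however the frame exposes its contained types, not an actual library method:

    // One pass over the contained NAL unit types instead of querying
    // several Contains* properties (each of which iterates on its own).
    bool hasSps = false, hasPps = false, hasIdr = false;

    foreach (int nalType in profileFrame.GetNalUnitTypes())
    {
        switch (nalType)
        {
            case 7: hasSps = true; break; // Sequence Parameter Set
            case 8: hasPps = true; break; // Picture Parameter Set
            case 5: hasIdr = true; break; // IDR slice (key frame)
        }
    }

    bool isKeyFrame = hasIdr;              // decide once, reuse everywhere
    bool isCodecConfig = hasSps || hasPps; // instead of re-iterating per property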
Marked as answer by juliusfriedman on 4/14/2016 at 7:49 AM
Coordinator
Apr 14, 2016 at 4:02 PM
Edited Apr 14, 2016 at 4:09 PM
Just for the record, I have used the frame method with no jitter buffer on MPEG-4, H.264, JPEG and a few other formats without any issues.

When there is loss there is loss, usually with the bit rate and GOP set according to the bandwidth actually available for consumption there is minimal loss and error concealment works to reduce any artifacts within a few milliseconds.

Adding a jitter buffer didn't do anything for the problem because the reordering was not severe. When reordering is severe enough that you need a jitter buffer, the real problem is the underlying network being exceeded...

One other potential issue I see is that you're possibly causing the loss.

You're calling Decode on the same thread where the packet is received.

This also can be improved by threading the events of the RtpClient but I haven't completed that for use yet.

It's trivial to do at the frame level but much harder without them.

I can see about getting that added soon also, but you could also just give the decoder the data already written out from the buffer; e.g. you write it to a file, so why do you need to decode it on the same thread? Just read that file from another thread...

For instance, even if I thread the events, one could just start consuming the entire CPU with long-running tasks on each thread...

I assume that the data being available and used at the time of the event would be more than sufficient; why do you think it needs to live longer, e.g. with a jitter buffer or some other buffer?

I think you will start to realize that the root of the issue is that you're doing too much work in one thread; you would also expend much more memory trying to solve with threads something that really has no benefit.

I can only make the library so efficient :)
Marked as answer by juliusfriedman on 4/14/2016 at 8:02 AM
Apr 14, 2016 at 5:44 PM
Edited Apr 14, 2016 at 6:18 PM
Hi Julius,

the jitter buffer was just an idea; if I do not need it, that's okay. I just need to reduce the artefacts. Speech is good, delay is okay, there are just way too many artefacts.
The device is Android 4.4 or above; right now it is connected to the gateway over WLAN with good signal strength.
The only other thing you need to handle is when 'final' is true.
So do something in OnSourceFrameChanged only if final is true, and in the other case do nothing? I already tried that; I think I didn't see much change.

Maybe my hack of having my own RFC6184Frame without the drawing class, or the wrapping bug you mentioned, could also be the problem?

I am at home now and will try tomorrow (in 13 hours :-) ).

Thanks a lot for your help!
Coordinator
Apr 14, 2016 at 6:46 PM
No problem.

Audio data is a lot smaller and easier to decode in most cases.

I still think you should only be using the memory stream to copy the data to your local buffer, e.g. in the application somewhere, and every so often, when KeyFrame is true, I would then call Decode with the byte[] of the ApplicationBuffer.

So it would go like this:

Packets arrive

//This is the last time you're going to see this frame and the frame is complete
Frame Changed (final) IsComplete == true

//This is the last time you're going to see this frame and the frame is not complete
Frame Changed (final) IsComplete == false

Those are the two scenarios which you are talking about.

A jitter buffer will help if the next packets belong to the frame for which the 'final' event was just received.

Hopefully we are in agreement and understanding about this.

Thus we can focus on the Depacketize calls.

I will attempt to elaborate:

If a frame is not complete, ideally you or anyone should still be able to call Depacketize; the frame SHOULD be smart enough to know if the packet was already depacketized. I am working on this and I need to ensure that my logic is correct. I can either use HashCode (but I should optimize that implementation [as well as Equals]) or I can just use trivial signed math, but the sequence gaps can never be larger than allowed by the RFC (which is not a problem until you do ZRTP anyway, which is fine).

The point being, you should always be keeping a copy of the data somewhere on the 'final' event.

If you want to remove a packet and free its depacketized data for some reason (this is more likely in retransmission streams, which are similar to redundant frames in some ways), then this is possible, but I have to work on a solution for this also, as when a packet is removed its memory is still in the buffer. (This is where the SegmentList would come into play, or overloads for Depacketize which force a copy... or extension methods.)

Nonetheless, in 90% of cases you will never remove a packet, but you may want to dispose the frame it belongs to, or the RtpClient will dispose it anyway; in such cases you either have to make a copy of the memory in the 'Buffer' or you have to ensure 'ShouldDispose' is going to be false ('SetShouldDispose(instance, false)') and store the frame somewhere for as long as you want.
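
A short sketch of that second option; where exactly the SetShouldDispose helper lives is an assumption here, and 'completedFrames' is an application-level list:

    // Keep the frame alive past the 'final' event instead of copying:
    // prevent it from being disposed, then store it for later use.
    if (final && frame.IsComplete)
    {
        Media.Common.BaseDisposable.SetShouldDispose(frame, false); // assumed helper location
        completedFrames.Add(frame); // dispose these yourself when done with them
    }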

Hopefully we are on the same page about this now.

Now how to handle the loss?

When there are gaps you can simply choose to feed the frames to the decoder before you dispose them, or before you give the decoder data which comes AFTER that frame.

E.g. if you decide to do this you can keep X amount of frames.

When X is approached you can feed the decoder with the Buffer of the frames but only up to the point of gaps.

Dispose and remove those packets.

Receive more frames and repeat.

This is the idea of the jitter buffer; it will eventually allow you to do this based on memory, time or other constraints, and without having to define a List of <X> and prune it.

I could possibly work around this by having a 'FinalFrame' which is cycled along with CurrentFrame and LastFrame but it's not really needed since with bad reordering this will just flip flop instances anyway.

Furthermore, this typically doesn't matter except at the absolute LOWEST of bitrates, and not for you either, especially since you have such a high rate of IFRAMEs.

You can safely discard frames with gaps because the next set will have a complete sequence anyway.

The problems with your artifacts and delay are as I stated above: you're doing 3 * the number of contained NALs iterations to see if it's a key frame, you're calling IsKeyFrame along with that, and you're calling Depacketize multiple times.

After you fix that you still have the issue of calling Decode on the same thread as the receive. You don't need to do this; if you want the best quality then you do, but if you want the best delay with small amounts of artifacts then you don't.

You could also take a hybrid approach but that will also cost memory...

Nonetheless, the way you can do this is as I just stated: you can maintain a memory stream 'ApplicationBuffer' which has the desired capacity and re-use it as needed for each 'RFC6184Frame' you decide you want to complete.

On the decoder thread you should have a WaitHandle or something which is signaled when you fill your 'ApplicationBuffer', and then call Decode on it to update the interface.

That is how it needs to be to ensure you don't block receiving, because if you take too long then you have the potential to miss more data...
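
A minimal sketch of that hand-off, with a queue standing in for the 'ApplicationBuffer'; Decode is the application's own method and everything else is illustrative:

    // The receive (event) thread only copies the depacketized data and
    // signals; a separate thread does the slow decode work.
    readonly System.Collections.Concurrent.ConcurrentQueue<byte[]> applicationBuffer =
        new System.Collections.Concurrent.ConcurrentQueue<byte[]>();
    readonly System.Threading.AutoResetEvent frameReady = new System.Threading.AutoResetEvent(false);

    // Called from the frame-changed event; stays cheap so receiving never blocks.
    void OnFinalFrame(Media.Rtp.RtpFrame frame)
    {
        frame.Depacketize();
        if (!frame.HasDepacketized) return;

        applicationBuffer.Enqueue(frame.Buffer.ToArray()); // copy out; the frame may be disposed after this
        frameReady.Set();
    }

    // Runs on its own thread, e.g. new System.Threading.Thread(DecodeLoop).Start()
    void DecodeLoop()
    {
        byte[] accessUnit;
        while (true)
        {
            frameReady.WaitOne();
            while (applicationBuffer.TryDequeue(out accessUnit))
                Decode(accessUnit, false, false, false); // slow work happens off the receive thread
        }
    }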

To verify this, just run the TestRtspClient against your stream for over 10 minutes.

Then quit the test and see the metrics.

You should have encountered frames with missed packets, but 'frames still missing packets' should be 0.

Thus this proves that even with re-ordering you will be able to handle the data rate of the stream; it's a method I usually use to ensure that a certain resolution and network condition will work before I start doing anything.

If you can't get a solid stream just processing the events then you definitely can't decode it.

After that point you can build the data transfer to the decoder as you need, using a Queue or a List or whatever you want, but you really don't need it the way your stream is configured; you just need to ensure you're not blocking the decoder and the decoder is not blocking you.

I typically just pass the data to the decoder at the last possible point (final == true) and don't ever require an intermediate buffer at the expense of very small artifacts (during a lot of movement) every once in a while.

Another hint: typically I also use 'IsComplete' for thumbnail purposes... if the frame isn't complete you can still decode it in a lower-quality or controlled manner which will not interfere with the 'main' decoding, e.g. a preview.

E.g. when IsComplete is false I can attempt the decode and update the interface or output if I want and if it becomes true then I decode it again with higher quality on the main stream....

There isn't an API for this in MediaCodec available out of the box; typically you would have to work with the internals of a decoder to achieve this. You could also take two decoder instances and combine their outputs during idle time for higher quality...

That's an aspect of the player...

A decoder decodes and player plays back, scales, does further post processing etc.

Hopefully you agree and we can then focus on the use of the API, e.g. how it could be better.

:)

Thanks again for your help and making these discussions because I am sure they will be valuable to others also.
Marked as answer by juliusfriedman on 4/14/2016 at 10:46 AM
Coordinator
Apr 14, 2016 at 7:42 PM
Edited Apr 14, 2016 at 9:17 PM
111988 further improves the Depacketize call so that it can be called many times.

Please note that if you create the Buffer by accessing the Buffer property without checking HasBuffer, then when and if new packets are added and you happen to call Depacketize again, the data will be correctly laid out in memory but the Buffer property will NOT have the data in the same order.

I am going to work on implementing a SegmentStream to address this, but I wanted you to see where I am going and why, and hopefully even to revise my way of thinking to some extent; I believe you will agree it has validity and doesn't add too much complication while still allowing the most effective use of memory and the scenarios to develop against.

The goal will be that Buffer can be accessed many times, many many times, potentially once for every received packet.

Depacketize is called if desired each time.

When Buffer is accessed it will return the instance which references the same collection in the frame, 'Depacketized'.

You or anyone will also have the ability to persist a write to the Depacketized memory such that you can dispose every single packet right after you get it and the frame will still have a copy of the data it needs to be decoded....

I think this is a powerful concept and doesn't add much memory overhead or complexity and additionally allows the memory to be referenced as a stream which can be read or written to if desired (which can be seeked). This allows other derivations such as a ReadOnly SegmentStream or OnlyReadForwardSegmentStream etc which also have useful purposes outside of Rtp.

It also has the benefit that each part of depacketization can be directly accessed as an array if needed where the Stream API is not compatible.

This is the best of both worlds and I don't think it can get any better than that...

Let me know what you think...
Marked as answer by juliusfriedman on 4/14/2016 at 12:22 PM
Apr 15, 2016 at 8:19 AM
Edited Apr 15, 2016 at 10:28 AM
Hello Julius,

thanks a lot for the info!
That was a lot of input; I will take it one step at a time and try things out.
I am working with the new version net7mma-111988, thanks for the update. Still little delay but heavy artefacts; sometimes the bottom area of the picture does not change at all, or changes more slowly than the top area.

The good news: using OnSourceFrameChanged there are no duplicate packets anymore and the order is always correct. Just sometimes something is still missing.
When there is loss there is loss, usually with the bit rate and GOP set according to the bandwidth actually available for consumption there is minimal loss and error concealment works to reduce any artifacts within a few milliseconds.
I have the bitrate set to 1000 kbps, MTU 1400. The buffer size is currently 12288. GOP is 10. I tried some different values but did not find a way to make it better. What values would you recommend testing with? It is still Android 4.4 or above, with WLAN and good bandwidth.

I still wonder if my general way of using OnSourceFrameChanged is correct, or maybe this is the problem causing the loss?
  1. OnSourceFrameChanged with if (tc.MediaDescription.MediaType == MediaType.video && final) goes to my Decode
  2. Decode each time creates a new RFC6184Frame from the given frame. The class is derived from RtpFrame and copied into my project because I cannot use System.Drawing: using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
  3. Depacketize with profileFrame.Depacketize(); and if HasDepacketized give profileFrame.Buffer.ToArray() to my decoder each time...
The interesting point here: if I add HasBuffer I have no picture anymore. It is always false.
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized && profileFrame.HasBuffer)
                    {
When I use && profileFrame.Buffer != null && profileFrame.Buffer.Length > 0) instead of HasBuffer it works.

Now for some testing of your suggestions: I am still working with OnSourceFrameChanged for the moment.
The only other thing you need to handle is when 'final' is true.
  • Only doing something if final, or not: I see no big change.
Finally, don't check IsComplete twice as that will cause more delay. Also, don't use the individual Contains properties to check a series of NAL units; use the Contains overload to check many types in one iteration.
  • Using IsComplete once or not, using it to flush or not, using IsKeyFrame to set the flag in the codec manually or not: also no big change.
On your suggestion to use my own thread:

I will keep the event and the feeding of the decoder input queue as it is, but I will now create a background worker which checks the decoder's output buffer and renders.
Coordinator
Apr 15, 2016 at 12:58 PM
The settings seem okay for the camera; I don't know if the GOP is in frames or seconds, but you will want to adjust it to the framerate, or not less than half the framerate.

I would personally leave the bitrate unlimited and use an FPS setting, resulting in a variable bitrate, if possible.

Constant bitrate will not be better where loss is occurring.

Maybe also consider changing the quality and resolution to suit the bandwidth available, e.g. 54 Mbps / 3.

As for HasBuffer, I need to look into that; it's possible that my API is still not completely ready for HasBuffer.

And once I finish the list of segments then Buffer will be the same as Depacketized and I will probably even be able to remove Depacketized in favor of the List.

We will discuss that after everything is working.

You're right, final is not a big change. I don't like having it optional, but having a separate event is more complicated to code for in most cases.

That combined with the other changes I recommend should get this working without artifacts and delay.

By using the background thread you will likely also remove the artifacts.

Let me know if that's not the case.

Once you're done we can see how to optimize further by using arrays directly etc. and work on the API.

I will have an update later today which should be good enough to show you what I mean.
Marked as answer by juliusfriedman on 4/15/2016 at 4:58 AM
Apr 15, 2016 at 2:11 PM
Edited Apr 15, 2016 at 2:12 PM
Hello Julius,

I kept the event handling and the feeding of the codec input buffer the way it was before, and I am now getting info from the codec output buffer and rendering in a background worker started asynchronously.

Maybe the top of the picture was a bit better and the bottom even worse, with missing changes or artefacts, but maybe there was no change at all. The change is minimal or none, so I think the problem is somewhere else. But I can now switch between same-thread and background-worker for upcoming tests. I need to get this working very soon... thanks again for your help!
Coordinator
Apr 15, 2016 at 2:56 PM
To test this out, what you need to do is create a plain .264 file with the same output you're feeding the decoder.

If that file can be played back without artifacts on another decoder, then the problem is in the decoder implementation in MediaCodec, or the way you're feeding the decoder data, or the flags you're passing, or some combination thereof.

I have to ask, as I am genuinely not sure: what exactly is not working?

You get the Rtp packets?

You can depacketize those RtpPackets using the RFC6184Frame class into a compliant bitstream?

The library seems to work to me...

If you need to remove the artifacts completely, why don't you only give the decoder complete frames, and only on IFrame boundaries?

It's not a problem to help at all; I just need to understand how I can help!
Marked as answer by juliusfriedman on 4/15/2016 at 6:56 AM
Coordinator
Apr 15, 2016 at 3:47 PM
I have only been able to decide how to handle the HasBuffer by checking CanRead...

It's difficult for me to imagine how anyone may want to use it, and I don't have a lot of feedback.

From your example I can tell you would like to access 'Buffer'; this is because I made it available before at the derived instance level of an RtpFrame.

Now I have it on the base class (RtpFrame), which is slightly more desirable in most cases I can see. In the case where you have to receive data out of order in the same frame and would like to obtain the depacketized data for one reason or another, this creates an interesting problem, because 'Buffer' is not an array (although it is backed by one); it's a Stream, specifically a MemoryStream.

This means if you access Buffer, PrepareBuffer will be called.

So what I suggest is to 'use' the buffer when you access it.

This way, the next time it will be re-created if you need it.

I am still working on possibly including the SegmentList / Stream. I have more of its implementation done but it's not ready for use yet.

When it is done the only other benefit would be that the stream would never have to be disposed or created again just to add new data.

I have uploaded new code at 111989.

After everything is 'working' I can revisit trying to optimize further.

In the meantime I will be working on some other issues related to SDP and multicast.

Let me know if you still need help!

After things are working I can go back to working on the RtpFrame and Depacketization buffers etc.

Right now I don't want to keep changing it while you're also in the middle of trying to work on something.
Marked as answer by juliusfriedman on 4/15/2016 at 7:47 AM
Coordinator
Apr 15, 2016 at 10:44 PM
I just uploaded my last code for the weekend @ 111992.

I put some notes in RFC2435 which shows what I mean about re-ordering not based on sequence number.

This is an important concept to grasp as it can be used in various ways, e.g. RFC 3640 interleaving, MTAP, JPEG FragmentOffset.

Currently the easiest way I know of to test this with real data is to make a test using the 'Reverse fragment offset order'

This means that I would make a JPEG Frame and reverse the data but not the sequence numbers.

E.g. read the frame and send the end data in sequence number 0 and the first data in sequence number X with FragmentOffset == 0 and the marker.

Or choose a random FragmentOffset to send from but keep the Rtp Sequences in order...

My library should handle that, and it is compliant with the RFC, as the RFC places no restrictions on the FragmentOffset order...

I know that VLC / FFMPEG does not handle this and in fact even GStreamer does not...

:)

Let me know if you have any questions or feedback!

Take care and let me know if I can help further!
Marked as answer by juliusfriedman on 4/15/2016 at 2:44 PM
Apr 16, 2016 at 8:00 AM
Hello Julius,

I have no hardware here at home and will wait until early Monday. :-) I need to get it working then...
Thanks a lot for your help and have a nice Weekend!
Apr 18, 2016 at 8:51 AM
Edited Apr 18, 2016 at 9:06 AM
Hello Julius,

I downloaded 111992 and am using this one now, thanks for the new version. I deleted my own RFC6184Frame class.

You told me I can use the frame itself now. So I am doing it this way; is this okay?
        internal void OnSourceFrameChanged(object sender, RtpFrame frame = null, RtpClient.TransportContext tc = null, bool final = false)
        {
            if (frame != null)
            {
                Media.Common.ISocketReferenceExtensions.SetReceiveBufferSize(tc, Globals.BufferSize); // Can I change the buffer size like this?

                if (tc.MediaDescription.MediaType == MediaType.video && final)
                {
                    if (VideoFrameReceived != null)
                        VideoFrameReceived(this, frame);
                }
            }
        }
And then the event VideoFrameReceived continues here; depacketize and hand off to my decoder:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                frame.Depacketize();
                if (frame.HasDepacketized && frame.Buffer != null) // && !frame.IsMissingPackets
                {
                    Decode(frame.Buffer.ToArray(), false, false, false);
                }
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
Is this okay or am I wrong? I didn't look into the reordering etc. yet; I need to have a picture again first :-)
It seems the decoder does not receive or recognize a video picture this way; I get no picture anymore.
Apr 18, 2016 at 10:30 AM
Edited Apr 18, 2016 at 10:46 AM
Hi Julius,

I cannot get it working with the above code; I never get a picture. Please have a look to see if there is something wrong or missing in my short code above.

So I returned to using my own RFC6184Media class again, which is yours but copied into my project and derived from RtpFrame so I can use it without System.Drawing.

Then it works again, but I only see the starting frame and it does not change anymore.
I removed this new line in my copied RFC6184Media.cs:
if (Media.Common.IDisposedExtensions.IsNullOrDisposed(packet)) return;

Now I see the whole stream again, but still with heavy artefacts.

I hope these two pieces of info help you. Help would be great, as well as advice on if and how I should use RtpFrame directly.

Right now I am again using it this way with my own copied class:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    if (Globals.DoDetailedLog)
                        ClientLogger.Instance.Log(string.Format("Lowest {0} and highest {1} Sequence Number", frame.LowestSequenceNumber, frame.HighestSequenceNumber));
                    profileFrame.Depacketize();
                    if (profileFrame.HasDepacketized)
                    {
                        if (Globals.DoDetailedLog && profileFrame.IsKeyFrame())
                            ClientLogger.Instance.Log(string.Format("KeyFrame at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));

                        //Decode(profileFrame.Buffer.ToArray(), profileFrame.IsKeyFrame(), profileFrame.IsComplete, profileFrame.ContainsPictureParameterSet || profileFrame.ContainsSequenceParameterSet || profileFrame.ContainsSupplementalEncoderInformation); // || profileFrame.ContainsCodedSlice
                        Decode(profileFrame.Buffer.ToArray(), false, false, false);
                    }
                }

            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine("Exception in DecodePacket: " + ex.ToString());
            }
        }
I will try the file in VLC player now to check the artefacts with the working version...
Thanks!

PS: Running the version from Friday with your older RtpClient and related code, I only get a Java IllegalStateException from the decoder at start, but it works. However, I have much loss in my log from frame.IsMissingPackets in OnSourceFrameChanged. With the new RtpClient and the same code I seem to have less loss, but I seem to get this Java IllegalStateException from the decoder (MediaCodec) more often.
Coordinator
Apr 18, 2016 at 12:24 PM
I will check into it; I am busy today and tomorrow, but probably Wednesday or Friday.
Apr 18, 2016 at 12:56 PM
Can you please at least say if this was the way you meant for using the frame itself?
public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                frame.Depacketize();
                if (frame.HasDepacketized && frame.Buffer != null) // && !frame.IsMissingPackets
                {
                    Decode(frame.Buffer.ToArray(), false, false, false);
                }
Thanks :-)
Coordinator
Apr 18, 2016 at 1:05 PM
Pretty much; I would only check IsMissingPackets when I check final. E.g. if final == false then you don't have to do anything for the most part.
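
A small sketch of that guard inside the frame handler; Decode and ClientLogger are the application's own members from earlier in this thread:

    // React only on 'final'; only then is IsMissingPackets worth checking.
    if (!final) return; // the frame will come around again

    if (frame.IsMissingPackets)
    {
        // last chance to decide what to do with a partial frame
        ClientLogger.Instance.Log("Final frame still missing packets");
        return;
    }

    frame.Depacketize();
    if (frame.HasDepacketized)
        Decode(frame.Buffer.ToArray(), false, false, false);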

If you're still having issues we can set up TeamViewer or something and try to get this solved when I have more time later this week.
Apr 18, 2016 at 1:23 PM
Edited Apr 19, 2016 at 12:55 PM
Thanks.

When I use the RtpFrame as in the last post to feed the decoder, I get no picture, and each time I feed the decoder:

04-18 14:20:36.168 I/mono-stdout(21436): Exception occured in H264Decoder Decode: Java.Lang.IllegalStateException: Exception of type 'Java.Lang.IllegalStateException' was thrown.
Exception occured in H264Decoder Decode: Java.Lang.IllegalStateException: Exception of type 'Java.Lang.IllegalStateException' was thrown.
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/external/referencesource/mscorlib/system/runtime/exceptionservices/exceptionservicescommon.cs:143
04-18 14:20:36.169 I/mono-stdout(21436): at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/external/referencesource/mscorlib/system/runtime/exceptionservices/exceptionservicescommon.cs:143
at Android.Runtime.JNIEnv.CallIntMethod (IntPtr jobject, IntPtr jmethod, Android.Runtime.JValue* parms) [0x00064] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/src/Runtime/JNIEnv.g.cs:404
04-18 14:20:36.170 I/mono-stdout(21436): at Android.Runtime.JNIEnv.CallIntMethod (IntPtr jobject, IntPtr jmethod, Android.Runtime.JValue* parms) [0x00064] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/src/Runtime/JNIEnv.g.cs:404
at Android.Media.MediaCodec.DequeueOutputBuffer (Android.Media.BufferInfo info, Int64 timeoutUs) [0x00057] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/platforms/android-23/src/generated/Android.Media.MediaCodec.cs:1179
at sks_Client.Droid.H264Decoder.Decode (System.Byte[] data, Boolean isKeyframe, Boolean isComplete, Boolean isCodecConfig) [0x000a9] in C:\Projekte\sks-Client for Gateway 2.0\sks-Client for Gateway 2.0\sks_Client_for_Gateway_2._0.Droid\H264Decoder.cs:235
04-18 14:20:36.171 I/mono-stdout(21436): at Android.Media.MediaCodec.DequeueOutputBuffer (Android.Media.BufferInfo info, Int64 timeoutUs) [0x00057] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/platforms/android-23/src/generated/Android.Media.MediaCodec.cs:1179


PS: When I do not change my code and use my copied RFC6184Media class, but with the older assemblies from 88 instead, my picture is better than with the latest 92! I have strong artefacts, but I see something and it has motion. With the latest version 92 the bottom of the picture is missing and there is less motion; after a short while no motion at all, only the start frame.
Apr 19, 2016 at 9:36 AM
Edited Apr 19, 2016 at 9:47 AM
Hello Julius,

after a lot of testing with your latest versions (I started from newest to oldest, which was the wrong direction):

your version 88 was good: heavy artefacts, but small delay; I could watch the whole stream.

With 89 and every version after, including 92, I got no picture at all, or only a small part or some pixels, and then it froze completely.

I hope this information helps you. I will now have a look at the differences between 88 and 89, if it is not too much...


So I am currently using 88 and my copied RFC6184Media class.
Apr 19, 2016 at 10:16 AM
Edited Apr 19, 2016 at 12:57 PM
I have now copied your 89 "RtpFrame" class into my 88 version.

The compiler complained about a missing class; I found it in "MemorySegment". So I copied this .cs file from your 89 into my 88.

I recompiled and got two errors about a missing GetPacketKey in media classes, which should have no impact for me because I am using my own RFC6184Media copied from yours, derived from RtpFrame. So I set -1 just for testing my stuff.

Now I always get one picture; the top looks okay, the bottom is extremely blocky, it freezes at once, and I get the above-mentioned Java.Lang.IllegalStateException. After that it stays frozen.

So whatever you did in 89 in RtpFrame and MemorySegment kills my stream. You should do the opposite of that change, and then my stream gets even better? :-))

Please have a look at your changes in 89 in these two classes.

If you have time for a TeamViewer session you are very welcome, thanks!


PS: I copied the RtpFrame and MemorySegment .cs files from 88 into 92 and recompiled 92. Now I get a working stream again, with artefacts. If I only act when 'final' is true, the bottom of my picture sometimes freezes (not always). If I do not use final at all, it does not. Then I only have the heavy-artefacts problem left.


A suggestion: in your RFC6184Media.cs (maybe others too) you should think about adding something like this, or else you will get constant exceptions:
                //(May need to handle re-ordering)
                //In such cases this step needs to place the packets into a seperate collection for sorting on DON / TSOFFSET before writing to the buffer.

                if (packet.Payload == null) // Add this or something like this
                    return;
... here packet.Payload is being used


And if I use the RtpFrame directly from OnSourceFrameChanged without my copied RFC6184Media class, Depacketize it, and if HasDepacketized give frame.Buffer.ToArray() to my decoder, I get these Java IllegalStateExceptions from the beginning (and no picture).
Apr 19, 2016 at 12:10 PM
Edited Apr 19, 2016 at 12:24 PM
And one more piece of info:
    public void DecodeFrame(Media.Rtp.RtpFrame frame)
    {
        try
        {
            using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
            {
                if (Globals.DoDetailedLog)
                {
                    ClientLogger.Instance.Log(string.Format("Lowest {0} and highest {1} Sequence Number", frame.LowestSequenceNumber, frame.HighestSequenceNumber));
                    if (profileFrame.IsKeyFrame())
                        ClientLogger.Instance.Log(string.Format("KeyFrame at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    if (profileFrame.ContainsCodedSlice)
                        ClientLogger.Instance.Log(string.Format("Slice at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    if (profileFrame.ContainsPictureParameterSet)
                        ClientLogger.Instance.Log(string.Format("PPS at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    if (profileFrame.ContainsSequenceParameterSet)
                        ClientLogger.Instance.Log(string.Format("SPS at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    if (profileFrame.ContainsInstantaneousDecoderRefresh)
                        ClientLogger.Instance.Log(string.Format("Refresh at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                    if (profileFrame.ContainsSupplementalEncoderInformation)
                        ClientLogger.Instance.Log(string.Format("EncoderInfo at   Time: {0}:{1}", DateTime.Now.ToString(), DateTime.Now.Millisecond));
                }
I am not using these flags, but I wanted to count keyframes and see if I get SPS/PPS. None of these are ever set at the moment, using 92 with RtpFrame and MemorySegment from 88 and my copied RFC6184Media.cs.
Apr 19, 2016 at 12:49 PM
Edited Apr 19, 2016 at 12:58 PM
Hello Julius,

I managed to get my saved video file working.

Following these steps you can play it in VLC media player:
http://www.stardot-tech.com/kb/index.php?View=entry&EntryID=186

It is not a real MKV file; it is my raw H.264 stream:
https://www.dropbox.com/s/robi6k0bsaapb6d/ownvideo.mkv?dl=0

I saved it this way (makes a nice raw video recorder): opening a FileStream for writing, then saving each frame delivered by OnSourceFrameChanged like this:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize(); 
                    if (profileFrame.HasDepacketized)
                    {
                        if (Globals.DoWriteFile)
                        {
                            fileStream.Seek(0, System.IO.SeekOrigin.End);
                            fileStream.Write(profileFrame.Buffer.ToArray(), 0, profileFrame.Buffer.ToArray().Length);
                        }
                        Decode(profileFrame.Buffer.ToArray(), false, false, false);
                    }
                }
I think something is going wrong in RFC6184Frame Depacketize (or at least in my case here).
It shows my problems. The camera is pointed at the ceiling, where there is a lamp. I sometimes wave my hand or put a small ring on it... :-)
Apr 19, 2016 at 2:04 PM
Edited Apr 19, 2016 at 2:15 PM
I hope it is not too much info today ;-)

I am still using OnSourceFrameChanged, then my own RFC6184Frame as directly above, and I have started to debug Depacketize.

I added a log to see my NAL unit types (maybe it helps you if there is something special or strange about my stream).
From your code and compared to RFC 6184:
7 is Sequence Parameter Set, 8 is Picture Parameter Set, 28 is Fragmentation Unit FU-A, and 1 is a single NAL unit packet (per H.264, a coded slice of a non-IDR picture); see the sketch below for how the type is read.

I always get these values starting with 7 and 8, which should repeat every second; this works as expected. The rest is mostly 28 and some 1.

7 8 28 28
1 1 7times 28
1 1 7 8 10times 28
7 8 10times 28
........
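
For reference, a generic sketch of how those type values are read from the RTP payload per RFC 6184 (the variable names are illustrative):

    // The NAL unit type is the low 5 bits of the first payload byte.
    int nalType = payload[offset] & 0x1F; // 7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice

    if (nalType == 28) // FU-A fragmentation unit
    {
        int fuHeader = payload[offset + 1];
        bool isStart = (fuHeader & 0x80) != 0; // S bit: first fragment of the fragmented NAL
        int fragmentedType = fuHeader & 0x1F;  // the type of the NAL being carried
    }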
Coordinator
Apr 19, 2016 at 9:21 PM
Hello,

I am just getting some free time.

I will have some time today and tomorrow but Thursday I will be busy again for a little.

Friday I am free.

Next week, Monday and Tuesday I am busy but Wednesday - Friday I should be free again.

Hopefully we can get you squared away by then.

In your current example you access Buffer multiple times; this makes it difficult to discern your expectations.

When Buffer is created, if the packets are out of order, the data in the buffer will reflect this. If new packets are added later, especially out of order, the data in the Buffer will not be usable because Depacketize was already called.

To work around this I suggested you use the 'Buffer' e.g. using(var stream = frame.Buffer)...

This will ensure that the Buffer gets updated when new packets are added and it will also ensure that the order of the data is correct when the buffer is re-created again.
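
In code, that suggestion looks roughly like this (Decode being the application's own method):

    // Dispose the Buffer after each access so the next access re-creates
    // it from the current, possibly grown, set of packets.
    using (var stream = frame.Buffer)
    {
        Decode(stream.ToArray(), false, false, false); // MemoryStream.ToArray copies regardless of Position
    } // disposed here; accessing frame.Buffer again rebuilds it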

I am still working on the state mechanism to discern if a packet was depacketized or not; obviously the SequenceNumber alone is not enough, and obviously the HashCode of the packet can't be used for various reasons.

This implies I need to change the type of collection that Depacketized is to something like a Dictionary, but I will also need to play around with the GetHashCode implementation of RtpPacket, because as of right now I have a weird implementation of GetHashCode which doesn't change with the SequenceNumber or Timestamp in the header.

I should probably change the GetHashCode implementation to also use the SequenceNumber in the calculation; that would prevent packet headers from having duplicate HashCodes at possibly unintended times.

What I am essentially trying to say is that two RtpPackets with exactly the same data will have different HashCodes right now, but their headers will have the same HashCode.
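
A sketch of the kind of change being described (illustrative only, not the library's final implementation; the exact fields and their types may differ):

    // Mix the fields that identify a packet within a stream into the hash
    // so that packets with equal data produce equal hash codes.
    public override int GetHashCode()
    {
        unchecked
        {
            int hash = 17;
            hash = hash * 31 + SequenceNumber;
            hash = hash * 31 + Timestamp;
            hash = hash * 31 + SynchronizationSourceIdentifier;
            return hash;
        }
    }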

I will add some logic to demonstrate this when I make an update.

In addition to this, in my current code I changed the lookup on Depacketized to use the SequenceNumber, but I incorrectly use this value to check if the packet is contained, and skip packets which are thought to be contained when they are not.

I will fix this but it's important to understand why this is happening especially if different types of streams are consumed which do require packet level re-ordering outside of the Rtp order.

In some cases a sequence of packets can be logically ordered according to RTP sequence number but contain data which is not ordered in the same way; this is typically called interleaving.

In those cases the depacketization methods are still the same, but the underlying NALs are potentially not in decoding order; thus if your decoder needs the NAL units in decoding order, they must be re-ordered before being given to the decoder.

That being said, most decoders can handle out-of-order NAL units and re-order them as required for decoding as they read the stream.

What I can do if I can't end up deciding on how to enforce a storage mechanism which supports my intentions is to revert some of the API back to the way it was until I can.

E.g. move Buffer to the RFC6184Frame like it was, so that each time it is depacketized it will have different results which reflect its contents at the time of the call.

I would also be able to do similar logic with the current API by disposing the Buffer when packets are added out of order, but it's not efficient to re-create the stream every time, which is why I suggested something like a SegmentStream that can facilitate this type of logic efficiently and would be more in line with the API I was trying to offer.

I will make some updates soon and we can go from there on what is working for you and what is not.
Marked as answer by juliusfriedman on 4/19/2016 at 1:21 PM
Coordinator
Apr 19, 2016 at 10:33 PM
I posted 111996.

It should resolve the duplicate sequence number issue.

Let's go from here and see if you still can't get this working with your decoder.

From here I need to work on making Depacketized more useful and potentially adding the SegmentStream.

Once all that is sorted out I can move the frame classes to their own assembly or within the Rtp assembly and proceed to work on coming up with a way to dynamically select the depacketizer.
Apr 20, 2016 at 8:12 AM
Edited Apr 20, 2016 at 9:40 AM
Hello Julius,

thanks for the update. I copied the new 96 DLLs into my project, which was working successfully (with artefacts) with 88.
I copied your RFC6184Media.cs into my project, derived it from RtpFrame, fixed some namespace compile errors, and removed the Start method etc.

It does not work. I get a first picture with the top okay and the bottom messed up, and it freezes at once. Sometimes the first NAL packets are inserted into the decoder without exception; sometimes the problems start with the ACK.

I am here for the next 8 hours (it is 9 am here). Help would be very welcome. I can't wait until next week; I must get it working today or early tomorrow, or our customer bites my head off and my project dies, or we get more time but must solve it in a different way with native PJSIP or something...

Thanks a lot for your help so far! If you can help me today please let me know.

04-20 09:01:45.045 I/mono-stdout(19862): Via: SIP/2.0/UDP 192.168.1.132:5060;branch=z9hG4bK007ba2db
04-20 09:01:45.045 I/mono-stdout(19862): Max-Forwards: 70
04-20 09:01:45.045 I/mono-stdout(19862): From: sip:sipclient@192.168.1.132;tag=as2dfc290d
04-20 09:01:45.045 I/mono-stdout(19862): To: sip:test@192.168.1.102:47042;tag=284853500
04-20 09:01:45.046 I/mono-stdout(19862): Call-ID: 70c0024434c23c404043c5de5c60980a@192.168.1.132:5060
04-20 09:01:45.046 I/mono-stdout(19862): CSeq: 103 ACK
04-20 09:01:45.046 I/mono-stdout(19862): Contact: sip:sipclient@192.168.1.132:5060
04-20 09:01:45.046 I/mono-stdout(19862): Content-Length: 0
04-20 09:01:45.046 I/mono-stdout(19862): User-Agent: Asterisk PBX GIT-master-60a15fe
04-20 09:01:45.046 I/mono-stdout(19862):
04-20 09:01:45.158 D/Mono (19862): Assembly Ref addref Media.Rtp[0xb9246358] -> Media.Ntp[0xb927f400]: 2
04-20 09:01:45.550 I/MediaCodec(19862): [OMX.qcom.video.decoder.avc] setting surface generation to 20338689
04-20 09:01:45.552 I/ACodec (19862): DRC Mode: Dynamic Buffer Mode
04-20 09:01:45.552 I/ExtendedCodec(19862): Decoder will be in frame by frame mode
04-20 09:01:45.588 D/SurfaceUtils(19862): set up nativeWindow 0xba886f30 for 640x480, color 0x7fa30c04, rotation 0, usage 0x42002900
Resolved pending breakpoint at 'RFC6184Media.cs:394,1' to void sks_Client.Droid.RFC6184Media.RFC6184Frame.ProcessPacket (Media.Rtp.RtpPacket packet, bool ignoreForbiddenZeroBit, bool fullStartCodes) [0x00012].
Resolved pending breakpoint at 'RFC6184Media.cs:412,1' to void sks_Client.Droid.RFC6184Media.RFC6184Frame.ProcessPacket (Media.Rtp.RtpPacket packet, bool ignoreForbiddenZeroBit, bool fullStartCodes) [0x0005e].
04-20 09:01:46.215 I/art (19862): Starting a blocking GC Explicit
04-20 09:01:46.239 I/art (19862): Explicit concurrent mark sweep GC freed 493(87KB) AllocSpace objects, 0(0B) LOS objects, 39% free, 9MB/15MB, paused 300us total 22.246ms
04-20 09:01:46.240 D/Mono (19862): GC_OLD_BRIDGE num-objects 81 num_hash_entries 82 sccs size 82 init 0.00ms df1 0.30ms sort 0.12ms dfs2 0.64ms setup-cb 0.06ms free-data 0.08ms links 2/2/2/1 dfs passes 165/84
04-20 09:01:46.240 D/Mono (19862): GC_MINOR: (Nursery full) pause 17.73ms, total 17.93ms, bridge 0.00ms promoted 1856K major 1856K los 180K
04-20 09:01:46.588 E/ACodec (19862): [OMX.qcom.video.decoder.avc] ERROR(0x80001009)
04-20 09:01:46.588 E/ACodec (19862): signalError(omxError 0x80001009, internalError -2147483648)
04-20 09:01:46.588 E/MediaCodec(19862): Codec reported err 0x80001009, actionCode 0, while in state 6
Exception occured in H264Decoder Decode: Java.Lang.IllegalStateException: Exception of type 'Java.Lang.IllegalStateException' was thrown.
04-20 09:01:46.636 I/mono-stdout(19862): Exception occured in H264Decoder Decode: Java.Lang.IllegalStateException: Exception of type 'Java.Lang.IllegalStateException' was thrown.
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/external/referencesource/mscorlib/system/runtime/exceptionservices/exceptionservicescommon.cs:143
04-20 09:01:46.636 I/mono-stdout(19862): at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/external/referencesource/mscorlib/system/runtime/exceptionservices/exceptionservicescommon.cs:143
at Android.Runtime.JNIEnv.CallIntMethod (IntPtr jobject, IntPtr jmethod, Android.Runtime.JValue* parms) [0x00064] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/src/Runtime/JNIEnv.g.cs:404
at Android.Media.MediaCodec.DequeueOutputBuffer (Android.Media.BufferInfo info, Int64 timeoutUs) [0x00057] in /Users/builder/data/lanes/3053/a94a03b5/source/monodroid/src/Mono.Android/platforms/android-23/src/generated/Android.Media.MediaCodec.cs:1179
at sks_Client.Droid.H264Decoder.Decode (System.Byte[] data, Boolean isKeyframe, Boolean isComplete, Boolean isCodecConfig) [0x000a9] in C:\Projekte\sks-Client for Gateway 2.0\sks-Client for Gateway 2.0\sks_Client_for_Gateway_2._0.Droid\H264Decoder.cs:308
Apr 20, 2016 at 8:17 AM
Edited Apr 20, 2016 at 8:42 AM
I am using the buffer only one time. The file was written once for testing and is now turned off. I left the code in, in case I need to write another test file in the future. But thanks for the hint!

I really need to get this working today or by tomorrow morning at the latest, or I will have to give up this approach and lose the project, or change to a totally different library. I am getting a lot of pressure from above because it took me too long to get this far. Of course improvements would be great, and if I get it working so far (not perfect but okay), I have many weeks to improve it and test new versions while developing other stuff too, but I need to have something I can demonstrate very soon.

If you can help me today with updates or TeamViewer sessions or both, please let me know; I can of course stay longer today if it helps. I'll be here for the next 8 hours or more, if I hear from you. Thanks again :-)
Apr 20, 2016 at 8:40 AM
Edited Apr 20, 2016 at 10:35 AM
What is correct, the comment or the code? :-)

From RFC6184Media.cs:
            //Must have at least 2 bytes
            if (count <= 2) return;

I just tried to see if the RtpFrame itself delivers something useful. I wrote a file again, called by OnSourceFrameChanged:
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                frame.Depacketize();
                if (frame.HasDepacketized && frame.Buffer != null) // && !frame.IsMissingPackets
                {
                    if (Globals.DoWriteFile)
                    {
                        // ToArray() allocates a fresh copy each call, so copy once.
                        byte[] data = frame.Buffer.ToArray();
                        fileStream.Seek(0, System.IO.SeekOrigin.End);
                        fileStream.Write(data, 0, data.Length);
                        return;
                    }
The file is only 16 kB for a call of over 10 seconds. When I created the other file linked above with the copied RFC6184Media.cs, it was over 2 MB and could at least be played.

Next I tried to write the file exactly like I did before with the copied RFC6184Media.cs, using it to Depacketize and then write profileFrame.Buffer.ToArray() into the file. The file is about 3 MB, but I cannot play it. When I do the same with version 88 I get a playable raw file with artefacts. So the 92 stream cannot be used after Depacketize at the moment.
Coordinator
Apr 20, 2016 at 12:41 PM
I am free today so I will troubleshoot with you.

If you have TeamViewer send me an email with the details and I'll join.

I don't know what you mean by "the comment or the code"; if you have better luck with pjsip then just use that.

I still think this has something to do with how and when Decode is called, e.g. if the frame is not complete, but we will easily determine the root cause today.

Please ensure you have Wireshark installed.

Please also have ffmpeg or another decoder ready to test the output from the library.
Marked as answer by juliusfriedman on 4/20/2016 at 4:41 AM
Apr 20, 2016 at 12:48 PM
Hello Julius,

I personally do not want to use PJSIP but I may be forced to, if I cannot deliver a result soon.

Thanks a lot, I am just installing TeamViewer. With the comment above I only wanted to ask whether that might be an error:
//Must have at least 2 Bytes, that means 2 is already enough?
        if (count < 2) return;
or
//Must have at least 3 bytes
        if (count <= 2) return;
Apr 20, 2016 at 12:54 PM
I still hope I can get it working with your help and your solution, that would be great!

I installed TeamViewer, what do you need? My ID, or a meeting ID?
Coordinator
Apr 20, 2016 at 12:58 PM
That comment refers to the FU packet header.

There are two bytes, the FU indicator and the FU header; the NAL type is reconstructed from those bytes.

It is technically legal to have those bytes and no other data as far as I can see, but this would usually only be for an end packet anyway...

Hence I allow it and put in the start code and reconstructed header; the question for me is "what if there's no data in the payload and the next packet is lost?"

Would the F bit need to be set to indicate a bitstream error?

I think so but we will see.
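
For reference, the reconstruction works roughly like this (a sketch of the RFC 6184 rule, not the library's actual code; the method name and layout here are illustrative):
    // Assumes payload[0] is the FU indicator and payload[1] is the FU header.
    public static bool TryReconstructNalHeader(byte[] payload, out byte nalHeader)
    {
        nalHeader = 0;

        // Must have at least the two FU bytes discussed above.
        if (payload == null || payload.Length < 2) return false;

        byte fuIndicator = payload[0]; // F bit, NRI and packetization type (28 = FU-A)
        byte fuHeader = payload[1];    // S, E, R bits and the original NAL unit type

        // F and NRI come from the indicator, the NAL unit type from the FU header (RFC 6184, 5.8).
        nalHeader = (byte)((fuIndicator & 0xE0) | (fuHeader & 0x1F));
        return true;
    }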
Coordinator
Apr 20, 2016 at 12:59 PM
Meeting Id and passcode.

Email it to me so that we don't have unintended access.

Do you also have Skype?
Marked as answer by juliusfriedman on 4/20/2016 at 4:59 AM
Apr 20, 2016 at 1:00 PM
I will install Skype. Maybe this will be easier if we can talk. Thanks, wait a second.
Coordinator
Apr 20, 2016 at 1:03 PM
Awesome, I am getting my coffee ready so take your time.
Marked as answer by juliusfriedman on 4/20/2016 at 5:05 AM
Apr 20, 2016 at 1:07 PM
Mail sent to you via this site. Thanks!
Coordinator
Apr 20, 2016 at 9:56 PM
I have added a way to copy the Depacketized data to a plain byte[] with 111998.

I have also done more work on the SegmentStream but it's not yet complete.

I have also made changes to Buffer so that it's not disposed and re-created every time the member is accessed.

When you access Buffer even once, the MemoryStream is still allocated and copied from the underlying data, so in short try not to use the 'Buffer' property of the RtpFrame at all if you're trying to avoid putting memory pressure on the garbage collector.

Instead, create a byte[] somewhere in your application class and use this byte array to feed the decoder.

Use CopyTo to take the data which has been depacketized and copy it to the byte[] you want to give the decoder, at some offset (initially 0).

Move the offset by the amount CopyTo returns and repeat until you run out of room in the application buffer.
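
As a rough sketch of that loop (assuming, per the above, that CopyTo returns the number of bytes copied; completedFrames and decoderBuffer are hypothetical application members):
    int offset = 0;
    foreach (Media.Rtp.RtpFrame frame in completedFrames)
    {
        frame.Depacketize();
        if (!frame.HasDepacketized) continue;

        // Copy the depacketized data and advance by the amount copied.
        offset += frame.CopyTo(decoderBuffer, offset);

        // Stop when the application buffer is full and hand it to the decoder.
        if (offset >= decoderBuffer.Length) break;
    }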

The only way it could get any more efficient than that is to make a Depacketize / ProcessPacket overload which puts the data directly into an array when depacketizing and doesn't even have a Depacketized member.

If I / you do that we may as well just remove the Depacketized member altogether, as it would never be used; one would just use the array of data which was maintained by the Depacketize call.

Since it doesn't make sense for an RtpFrame to contain an additional array, what I will probably end up doing is to just revert some of my design changes and keep the Buffer and Depacketized members in the derived classes where they are useful.

I updated the issue @ https://net7mma.codeplex.com/workitem/17339 to explain this.

Let me know if I can help you any further!
Marked as answer by juliusfriedman on 4/20/2016 at 1:56 PM
Apr 21, 2016 at 8:04 AM
Edited Apr 21, 2016 at 9:14 AM
Hello Julius,

I downloaded 98, recompiled it, and copied your assemblies to my project. I copied your RFC6184Media to my project and fixed the compiler errors.

One thing: yesterday we used using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame, true, true));
that does not work anymore. I removed the two parameters again, I don't know if we still need them.

Testing with our code from yesterday or with the older Decode method, in both cases I got one picture that freezes completely.
Instead, Create a byte[] somewhere in your application class and use this byte array to feed the decoder.
How do I feed it?

From the issue:
Added Depacketized property for keeping data in memory and sorting.
I do not find that one. I guess this one would be used to feed my decoder?

I cannot find this one. Maybe you forgot to update something?

I went back to using the 88 assemblies. Works. I will work on settings etc. to see if I can decrease the delay this way.

With 88 both the new and the old decode method work, without any change I would recognize.
Even with a halved bitrate, or the bitrate set to 0 and quantization quality set to a middle value, the delay is still high. It starts at around 3 seconds but grows to 10 and more with a good picture...

Maybe I should throw away some complete frames when the time difference gets too high or the buffer grows too large?
Apr 21, 2016 at 10:26 AM
Edited Apr 21, 2016 at 10:54 AM
I tried to drop some frames to see if the delay gets smaller then (still using 88; although I am using the buffer only one time, I cannot run 98 and cannot find the new property there for not using the buffer).
        public void DecodeFrame(Media.Rtp.RtpFrame frame)
        {
            try
            {
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    //if (!profileFrame.IsKeyFrame())
                    //    return;

                    //dropCounter++;
                    //if (dropCounter < 3)
                    //    return;
                    //else
                    //    dropCounter = 0;

                    profileFrame.Depacketize(); //false true
                    if (profileFrame.HasDepacketized)
                    {
                        byte[] buffer = profileFrame.Buffer.ToArray();
                        DecodeOld(buffer, 0, buffer.Length, false, false, false);
                    }
                    return;
Just to let you know:
If I use if (!profileFrame.IsKeyFrame()) return; or the same with !profileFrame.IsComplete, I get no picture at all.
If I do not return, I get a very good picture with a growing delay (from 4 seconds growing past 10). If I use that dropCounter hack, the delay gets smaller and is never above 4 seconds, but it produces some artefacts again.
PS: With 92 it also freezes after the first frame. Buffer only used one time.
Apr 21, 2016 at 1:00 PM
PS: Just before my internal presentation I found out that with the bitrate on the camera / gateway set to 0 and the quantization quality parameter set to 48, for example, the delay is completely gone. The picture is not good, looks a bit fragmented, but has no artefacts.
When I set quant to 25, the picture looks good again but I have a strong delay. My presentation was okay, thanks for your help!

So I think if the buffer / allocation problem was a bit better, it would be great!
Coordinator
Apr 27, 2016 at 8:10 PM
Sorry for the delay.

Understood about the allocations; this library doesn't allocate the memory which is causing you the problem, MemoryStream does, and I am attempting to work around that by creating the SegmentStream.

Until it's done you can use CopyTo, which basically does the same thing but requires that you have a buffer already allocated to store the data in for decoding.

I will have more updates later today or tomorrow which should include unit tests for SegmentStream.
Marked as answer by juliusfriedman on 4/27/2016 at 12:10 PM
Apr 29, 2016 at 1:22 PM
Hello Julius,
//get { return HasBuffer ? m_Buffer : m_Buffer = new Common.SegmentStream(Depacketized.Values); }
This did not work. It (the SegmentStream) cannot be cast to a MemoryStream.
I just rebuilt 112013 and replaced my assemblies, which were still from 88.
I get one picture again which freezes, and then I get the IllegalStateExceptions.

I am still using
                using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
                {
                    profileFrame.Depacketize(); //false true
                    if (profileFrame.HasDepacketized)
                    {
                        if (VideoFrameReceived != null)
                        {
                            VideoFrameReceived(this, profileFrame.Buffer.ToArray());
and I am just starting to examine this issue. Thanks :-)
Coordinator
Apr 29, 2016 at 1:29 PM
You would have to change the property type to either Stream or SegmentStream, for both the field and the property.

I don't see how there could possibly be any difference in operation, because I haven't made use of the new stream within the RtpFrame class yet, which means it's the same as it always was.

Let me know what you find.
Marked as answer by juliusfriedman on 4/29/2016 at 5:29 AM
Apr 29, 2016 at 1:41 PM
Edited Apr 29, 2016 at 2:17 PM
You would have to change the property type to either stream or segment stream as well for the field and the property.
Thanks, and sorry, which field or property?

If you are talking about the getter in the RtpFrame, I am using the downloaded version again.
Coordinator
Apr 29, 2016 at 2:22 PM
'm_Buffer' is the field; right now it is the backing for the property 'Buffer', and that field is used in PrepareBuffer.

Since PrepareBuffer is only called when accessing the Buffer property on the RFC6184Frame class you can just worry about the Buffer property.

Since the type of the property is 'MemoryStream' this needs to be changed; Stream is sufficient, but beware: when calling Stream.CopyTo there is an intermediate allocation buffer used to transfer the data, which has a default size of 81920 bytes.

See the reference source, which has a note about this.

SegmentStream provides a CopyTo overload for arrays and a CopyToStream method for streams which does not use an intermediate buffer.

This is also true for ReadByte, WriteByte and Read / Write operations as well, which unfortunately use intermediate buffers in the base Stream implementation; SegmentStream does not.
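
For example, usage would look something like this (a sketch; the depacketized frame and destination stream are assumed, and CopyToStream is taken to accept the destination stream):
    Media.Common.SegmentStream segments = profileFrame.Buffer; // after Depacketize()

    // Array overload, no intermediate buffer.
    byte[] dest = new byte[segments.Length];
    segments.CopyTo(dest, 0);

    // Stream-to-stream copy without Stream.CopyTo's default 81920 byte buffer.
    segments.CopyToStream(destinationStream);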

You can verify this by running the unit test for the SegmentStream a few times; I use the MemoryStream for comparison of their ToArray and CopyTo methods, and you will see that allocations related to the MemoryStream class cause the GC to need to run much more than with the SegmentStream.

If you specify the capacity or the buffer of the MemoryStream manually this is much better.

Anyway, if you look at the code responsible for the property 'Buffer'
//get { return HasBuffer ? m_Buffer : m_Buffer = new Common.SegmentStream(Depacketized.Values); }
Since HasBuffer will always return false (because the 'm_Buffer' field has not been created by PrepareBuffer), you can quite easily use the code as is if you simply change the type of the property and the field to what you want it to be:
public Common.SegmentStream Buffer // or System.IO.Stream
{
    get { return HasBuffer ? m_Buffer : m_Buffer = new Common.SegmentStream(Depacketized.Values); }
}
After an access of Buffer, HasBuffer will work as expected, so future calls will return the same CLR reference to the SegmentStream unless it has been disposed.

I will be integrating the SegmentStream into the RtpFrame class soon enough; I am just working on the storage mechanism, which will allow adding packets and packetizing and depacketizing in a single collection while still allowing for fast insert and remove.

The decoder will still need an array in your case, which is why I suggested a RecycleableStream; you would be able to reserve a single large buffer and reuse it for decoding throughout the application.

I personally think that if you need an array then the facade to Stream is not helpful in your situation; you should just keep an array around in the application and provide it to be used by a Stream when necessary, if ever.

You could also abstract the decoder process slightly more than you do now to keep things on different threads.

Depending on what the decoder API is like, you should be able to queue a buffer and keep track of where you put it; this should NOT be happening on the RtpClient's thread through the event.

For any packet or frame which arrives you should be copying the data to your application buffer for use; when FrameChanged events are used, the packets are automatically copied for you and put into an RtpFrame.

This frame is given to you multiple times via FrameChanged when packets are added to it, until final is true.

The frame may be complete before final is true but that is up to the application to handle.

When final is true you must either prevent the Dispose call of the frame by using 'Common.BaseDisposable.SetShouldDispose(frame, false)'

AND PROCESS THIS FRAME OUTSIDE OF THIS EVENT HANDLER, e.g. make the call, then add the frame to a Queue which is processed in a separate thread; after processing is complete, use 'Common.BaseDisposable.SetShouldDispose(frame, true, true)' to signal the GC to collect the resources used by the instance.
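
A minimal sketch of that pattern (the handler signature matches the one used later in this thread; completedFrames is an assumed thread-safe queue, not part of the library):
    readonly System.Collections.Concurrent.ConcurrentQueue<Media.Rtp.RtpFrame> completedFrames =
        new System.Collections.Concurrent.ConcurrentQueue<Media.Rtp.RtpFrame>();

    internal void OnSourceFrameChanged(object sender, Media.Rtp.RtpFrame frame = null, Media.Rtp.RtpClient.TransportContext tc = null, bool final = false)
    {
        if (frame == null || !final) return;

        // Prevent the client from disposing the frame when this event returns.
        Media.Common.BaseDisposable.SetShouldDispose(frame, false);

        // No decoding here; the frame is handed to a separate worker thread.
        completedFrames.Enqueue(frame);
    }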

When Decode is called for the decoder, you should ensure that the buffers processed by Decode only go up to the point at which completed frames have queued data; this allows you to not process any data in Decode if there are incomplete frames, or if you have many completed frames and you only want to display a single frame.

OR, if you don't want to keep the frames around longer....

Call Depacketize and use one of the methods, CopyTo / Buffer / etc., depending on what you need, to get the data out of the Depacketized member.

The problem with this design is that you still have to feed your decoder; unfortunately, in your design the decoder is using the same thread as the RtpClient, which is blocked while Decode is called.

You can use a BackgroundWorker, obviously, but the frame may already be disposed by the time it needs to be processed, so how do you work around that (without keeping track of the frames you want to decode yourself)?

You would put the decoder in a separate process; that process would have StandardInput. You would CopyTo the SegmentStream of the frame to the Stream which is the StandardInput of the decoder, from your application, directly in the event handler.

The decoder process would read its StandardInput stream and handle it as necessary; if it needs to maintain a framerate etc., that is done within the decoder thread.
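
A sketch of that setup, assuming an external decoder which reads raw H264 from its standard input (ffmpeg here purely as an illustration, not a recommendation of specific flags):
    var info = new System.Diagnostics.ProcessStartInfo
    {
        FileName = "ffmpeg",                          // illustrative decoder choice
        Arguments = "-f h264 -i pipe:0 -f rawvideo out.yuv",
        RedirectStandardInput = true,
        UseShellExecute = false
    };

    System.Diagnostics.Process decoder = System.Diagnostics.Process.Start(info);

    // In the frame event handler: copy the depacketized data to the decoder's stdin.
    profileFrame.Buffer.CopyTo(decoder.StandardInput.BaseStream);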

Let me know if any of that doesn't make sense.
Marked as answer by juliusfriedman on 4/29/2016 at 6:22 AM
Apr 29, 2016 at 2:37 PM
Hi Julius,

in order to test which one could be better, I need to get it working again.

I am still stuck on 88 with my above-posted code used in OnFrameChanged.

When I just change the assemblies I get only one single frame, then it freezes.

And a quick test, changing the buffer to CopyTo for the above code like this:
                        //VideoFrameReceived(this, profileFrame.Buffer.ToArray());
                        byte[] buffer = new byte[512000];
                        profileFrame.CopyTo(buffer, 0);
                        VideoFrameReceived(this, buffer);
also shows one single frame then freezes. And the output shows only those IllegalStateExceptions from the decoder...
Apr 29, 2016 at 2:54 PM
I still believe there is something wrong after 88, no matter what I try.
It should not be the RFC6184Frame, which I have in my project from the latest version.
It could be the RtpFrame itself, or the SegmentStream, or the new MemoryStream, or something else that changed since 88...

Thanks a lot for your help and have a nice weekend!
Coordinator
Apr 29, 2016 at 3:07 PM
RtpFrame doesn't yet use the SegmentStream...

CopyTo of the RtpFrame is an independent call which just copies any depacketized data to the given array for when streams are not used.

MemoryStream is not my class, it's a BCL class.

No problem, I will hopefully have another update later today.
Marked as answer by juliusfriedman on 4/29/2016 at 7:07 AM
Coordinator
Apr 29, 2016 at 3:12 PM
I just put up 112015, it should have fixed any issue with CopyTo.
Marked as answer by juliusfriedman on 4/29/2016 at 7:13 AM
May 2, 2016 at 8:23 AM
Hello Julius,

I upgraded to 112018.

When I now get here:
RTPClient = Media.Rtp.RtpClient.FromSessionDescription(sessionDescription, rtpPort: 0, rtcpEnabled: false);

I get this exception:

Problem occured during connection RTP: System.TypeInitializationException: The type initializer for 'Media.Common.Binary' threw an exception. ---> System.NotImplementedException: The method or operation is not implemented.
at System.Runtime.InteropServices.Marshal.ReadByte (System.Object ptr, Int32 ofs) [0x00000] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/mcs/class/corlib/System.Runtime.InteropServices/Marshal.cs:781
at Media.Common.Binary..cctor () [0x000fc] in c:\Projekte\net7mma-112018\Common\Classes\Binary\Binary.cs:1038
--- End of inner exception stack trace ---
at Media.RFC3550.Random32 (Int32 type) [0x00000] in c:\Projekte\net7mma-112018\Rtp\RFC3550.cs:88
at Media.Rtp.RtpClient+TransportContext.FromMediaDescription (Media.Sdp.SessionDescription sessionDescription, Byte dataChannel, Byte controlChannel, Media.Sdp.MediaDescription mediaDescription, Boolean rtcpEnabled, Int32 remoteSsrc, Int32 minimumSequentialpackets, System.Net.IPAddress localIp, System.Net.IPAddress remoteIp, Nullable`1 rtpPort, Nullable`1 rtcpPort, Boolean connect, System.Net.Sockets.Socket existingSocket, System.Action`1 configure) [0x000f7] in c:\Projekte\net7mma-112018\Rtp\RtpClient.cs:408
at Media.Rtp.RtpClient.FromSessionDescription (Media.Sdp.SessionDescription sessionDescription, Media.Common.MemorySegment sharedMemory, Boolean incomingEvents, Boolean rtcpEnabled, System.Net.Sockets.Socket existingSocket, Nullable`1 rtpPort, Nullable`1 rtcpPort, Int32 remoteSsrc, Int32 minimumSequentialRtpPackets, Boolean connect, System.Action`1 configure) [0x0005a] in c:\Projekte\net7mma-112018\Rtp\RtpClient.cs:264
at sks_Client.Manager.RTPClientManager.InitSessionRTP (Independentsoft.Sip.Sdp.SessionDescription session, System.Action`1 playVideoAction) [0x00081] in C:\Projekte\sks-Kinkel\Repos\Apps Gateway 2.0\sks-Client\sks-Client\sks_Client\Manager\RTPClientManager.cs:120
Thread finished: <Thread Pool> #3
The thread '<Thread Pool>' (0x3) has exited with code 0 (0x0).
Thread finished: <Thread Pool> #5
05-02 09:17:22.574 D/Mono (24792): [0x9d308930] worker finishing
The thread '<Thread Pool>' (0x5) has exited with code 0 (0x0).
Thread finished: <Thread Pool> #4
05-02 09:17:36.793 D/Mono (24792): [0x9d409930] worker finishing
The thread '<Thread Pool>' (0x4) has exited with code 0 (0x0).

Could you please have a look?
May 2, 2016 at 8:31 AM
Edited May 2, 2016 at 10:25 AM
I don't think Xamarin can handle operations like this (from Binary.cs line 1038):
                byte atOffset = System.Runtime.InteropServices.Marshal.ReadByte(Binary.SedecimBitSize, offset);// memoryOf[offset];
Can you please have a look? I overwrote Binary.cs with an older version and tried again. The RtpClient is starting again, but again I get one frozen picture and then IllegalStateExceptions from the MediaCodec (using OnFrameChanged, both with RFC6184Media.RFC6184Frame profileFrame and profileFrame.Buffer.ToArray() OR with profileFrame.CopyTo(buffer, 0);).

I am still stuck on version 88! If it helps, we can do a Skype session? I just want to get a newer version running instead of the old one. It would be great to see if the delay gets better with a new one...

Thanks a lot!

Firlefanz
Coordinator
May 4, 2016 at 2:06 PM
No?

https://github.com/mono/mono/blob/master/mcs/class/System/System.Net.Sockets/Socket.cs#L1862

It seems mono uses that method extensively...

Are you sure this isn't a trust issue?

I have just updated the code.

Let me know if you need to setup a session for troubleshooting.
Marked as answer by juliusfriedman on 5/4/2016 at 6:06 AM
May 4, 2016 at 3:33 PM
Edited May 4, 2016 at 3:39 PM
Hello Julius,

thanks a lot for the new version!

I just downloaded it. I am not sure if the above-mentioned problem is still a problem. I will have a look... thanks for the hint.

I just downloaded the new version 112034.

When I start the video call, I get the following exception (I added 3 lines to RtpClient.cs, so the line numbers there may be 2 or 3 lines off):
05-04 16:20:53.535 I/ExtendedCodec(31412): Decoder will be in frame by frame mode
05-04 16:20:53.561 D/SurfaceUtils(31412): set up nativeWindow 0xba6b2658 for 720x576, color 0x7fa30c04, rotation 0, usage 0x42002900
Exception in DecodeFrame: System.ArgumentException: Destination array was not long enough. Check destIndex and length, and the array's lower bounds
at System.Array.Copy (System.Array sourceArray, Int32 sourceIndex, System.Array destinationArray, Int32 destinationIndex, Int32 length) [0x000e2] in /Users/builder/data/lanes/3053/a94a03b5/source/mono/mcs/class/corlib/System/Array.cs:961
at Media.Common.SegmentStream.CopyTo (System.Byte[] destination, Int32 offset) [0x00021] in c:\Projekte\net7mma-112023\Common\Classes\SegmentStream.cs:266
at Media.Common.SegmentStream.ToArray () [0x00017] in c:\Projekte\net7mma-112023\Common\Classes\SegmentStream.cs:294
at (wrapper remoting-invoke-with-check) Media.Common.SegmentStream:ToArray ()
at sks_Client.Manager.RTPClientManager.DecodeFrame (Media.Rtp.RtpFrame frame) [0x0002f] in C:\Projekte\sks-Kinkel\Repos\Apps Gateway 2.0\sks-Client\sks-Client\sks_Client\Manager\RTPClientManager.cs:351
Thread started: #9
referenceTable GDEF length=814 1
referenceTable GSUB length=11364 1
referenceTable GPOS length=47302 1
referenceTable GDEF length=808 1
referenceTable GSUB length=11364 1
referenceTable GPOS length=49128 1
referenceTable head length=54 1
Debugger Connection Lost: Debugger lost connection to the running application. Likely this means the application terminated unexpectedly.
I am still using OnFrameChanged with my code:

Check payload type and final, then
            using (RFC6184Media.RFC6184Frame profileFrame = new RFC6184Media.RFC6184Frame(frame))
            {
                profileFrame.Depacketize(); //false true
                if (profileFrame.HasDepacketized)
                {
then decode with profileFrame.Buffer.ToArray().

Thanks a lot and have a nice day!
Coordinator
May 4, 2016 at 3:44 PM
That probably means I have a bug...

It would be easier to step through and debug this unless you can make a quick reproducible unit test.

Is this happening because new packets are being added to the frame on a different thread at the same time as a copy?

It shouldn't be, but I will check that method and post back shortly.

Any other information, such as what the offset and length of the segment being read are and what the expected length of the array being copied is, may help to reveal the problem.
Marked as answer by juliusfriedman on 5/4/2016 at 7:44 AM
Coordinator
May 4, 2016 at 4:01 PM
112025 should fix this in two ways: it provides a length that is atomic at the time of the call, copied by value into the function which does the copying, and the copy now takes the Min with the count given, to prevent such problems when the lengths are not exactly equal.
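
Conceptually the guard is along these lines (illustrative names, not the actual SegmentStream code):
    // Copy at most Min(length at the time of the call, remaining destination space).
    static int GuardedCopy(byte[] source, long lengthAtCall, byte[] destination, int offset)
    {
        int toCopy = (int)System.Math.Min(lengthAtCall, (long)(destination.Length - offset));
        System.Array.Copy(source, 0, destination, offset, toCopy);
        return toCopy;
    }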

Let me know if this still persists.
Marked as answer by juliusfriedman on 5/4/2016 at 8:20 AM
May 6, 2016 at 8:46 AM
Edited May 6, 2016 at 11:08 AM
Hello Julius,

thanks a lot for the update! 112026 is now working on my Android device, and I think the performance is better; I still need to do some testing.
That's really good news! Thanks again :-)

PS:
On Android the performance is better. When moving fast, artefacts appear, but only for a short while.

But it is not working on my iOS device. All the changes I made in 88 I re-did in the RtpClient of 112026.

Still not working there. I uploaded my changed RtpClient so you can have a look at it if you like:

https://www.dropbox.com/s/gncnjesxe85d778/RtpClient.cs?dl=0
You can quickly find all changes made especially for iOS if you search it for "beh".

Are there other new changes like using SetSocketOption, SendBufferSize, ReceiveBufferSize somewhere else?
Or maybe it is all the System.Runtime.InteropServices.Marshal.ReadByte things?

I cannot explore it too deeply today, I am heading home early because I need to see the dentist (aaargh!)

Have a nice Weekend!
Coordinator
May 6, 2016 at 12:51 PM
Good to hear.

I actually saw the dentist yesterday, so good luck with that.

I will check into the iOS stuff today if possible.

If you're not building with NATIVE then those calls are omitted; I will see about including some try / catch for iOS, possibly in the configure socket method.

iOS apparently doesn't let you set the send buffer size / receive buffer size or a few other things for some dumb reason; well, it does, but not through Xamarin...


Have a good weekend also!
Marked as answer by juliusfriedman on 5/6/2016 at 4:51 AM
May 10, 2016 at 2:17 PM
Edited May 10, 2016 at 2:18 PM
Hello Julius,

I just tried both audio and video together with Android and RTP version 112027.

Only video works well: small delay, and when animations appear, little artefacts for a short time.

When doing both audio and video, the artefacts get extremely heavy again.
Is it missing hardware acceleration maybe, or do I need to change something else in my client; do you have any idea?
        internal void OnSourceFrameChanged(object sender, RtpFrame frame = null, RtpClient.TransportContext tc = null, bool final = false)
        {
            if (frame != null)
            {
                if ((frame.PayloadType == Globals.H264PlayloadType) && final)
                {
                    DecodeFrame(frame); // Feed the video decoder, which still uses RFC6184Media.RFC6184Frame with Depacketize then Buffer.ToArray
                }
                else if (tc.MediaDescription.MediaType == MediaType.audio)
                {
                    frame.Depacketize();
                    if (frame.HasDepacketized)
                    {
                        if (AudioDataReceived != null)
                        {
                            byte[] data = frame.Buffer.ToArray();
                            AudioDataReceived(this, new DataEventArgs() { Data = data }); // Feed the audio decoder
                        }
                    }
                }
            }
        }
Any idea what I need to do to improve playing video and audio together?

Thanks a lot!
Coordinator
May 10, 2016 at 2:44 PM
Edited May 10, 2016 at 2:54 PM
Glad you have it working for video; not sure why audio would not be working, especially if video is...

The artifacts are related to dropped packets. It's important to understand that when you are handling events for frames or packets, you are doing so on the RtpClient's worker thread, which is also responsible for Send and Receive operations.

This means when you handle those events you want to either immediately defer to a thread / background worker or use BeginInvoke to offload processing from the Worker Thread to ensure you do not miss packets.

You can also increase the ReceiveBufferSize on the socket to allow more time processing in the events at the expense of using more memory.
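
For example (rtpSocket here stands for whichever RTP socket the TransportContext is using; the size is an arbitrary illustration):
    // More kernel-side buffering gives the event handlers more slack,
    // at the cost of memory and potentially added latency.
    rtpSocket.ReceiveBufferSize = 1024 * 1024; // e.g. 1 MB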

I have played with threading the event dispatch so that it occurs on a separate thread, but the result is that you just end up delaying events when processing packets or frames, and it also takes more memory.

It's quite easy to do the same thing yourself by calling:
// Queue<RtpFrame> finalFrames = new Queue<RtpFrame>();

Common.BaseDisposable.SetShouldDispose(frame, false);

//frame now can be placed in a collection for processing

finalFrames.Enqueue(frame);
In a background worker you would process those frames for display or skip them to increase frame-rate.
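
The worker side might look roughly like this (assuming finalFrames is a ConcurrentQueue<RtpFrame> rather than the plain Queue above, for thread safety, and DecodeFrame stands in for your own decode call):
    // Runs on its own thread, never on the RtpClient's worker thread.
    void ProcessFrames()
    {
        Media.Rtp.RtpFrame frame;
        while (running)
        {
            if (!finalFrames.TryDequeue(out frame)) continue;

            frame.Depacketize();
            if (frame.HasDepacketized) DecodeFrame(frame); // display, or skip to catch up

            // Allow the frame's resources to be collected now.
            Media.Common.BaseDisposable.SetShouldDispose(frame, true, true);
        }
    }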

In a future version I may integrate this threading automatically, as it helps performance, but as I stated it also costs more memory, so it should be up to the application to determine whether it needs this approach, which is the same as simply defining someplace in the application to put the packets for processing.

If you look at the unit test for RtspClient you will see that it consumes all streams available from a source and processes data directly in the handler without missing any packets; the difference is that the processing is not intense and doesn't cost much time. Decoding is much more intense and costs more memory and CPU, which is why you are missing packets: Decode is a blocking operation, and by the time the decoder is done decoding you have already missed some packets.

This is why you cannot call Decode on the receive thread: it is also the same thread which dispatches events for processing.

Please let me know if you understand that.

Also, keep in mind that you will have to employ some type of synchronization for audio / video playback to keep both streams in sync. This involves keeping the two decoders synchronized in what they are decoding; you can do this by ensuring the timestamps of the RtpFrames you are giving the decoders correspond to the same playback time for the clock rate of each stream. E.g. don't give too much data to one decoder to process at a time, as it will cause skew in playback, especially if you miss packets from one stream and not the other.
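
As a rough illustration of the timestamp comparison (the clock rates come from the SDP rtpmap, e.g. 90000 for H264; the 40 ms threshold is arbitrary):
    // Media time in seconds, relative to the first timestamp seen on the stream.
    static double MediaSeconds(uint rtpTimestamp, uint firstTimestamp, double clockRate)
    {
        return unchecked(rtpTimestamp - firstTimestamp) / clockRate;
    }

    // Feed a video frame only while it is near the audio playback position.
    static bool InSync(double videoSeconds, double audioSeconds)
    {
        return System.Math.Abs(videoSeconds - audioSeconds) <= 0.040;
    }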
Marked as answer by juliusfriedman on 5/10/2016 at 6:45 AM
May 12, 2016 at 7:33 AM
Edited May 12, 2016 at 7:33 AM
Hello Julius,

thanks a lot for your hints. We added a queue now which collects frames and processes them on its own thread. It is much better again, on Android nearly good.
Still, the voice quality could be better. But video quality is okay for medium resolution now. We did not sync them yet.

Also still working on iOS, where video is okay and voice can be heard but with bad quality; I guess it is a decoder / playback problem there.

Later (not this week, maybe next) we need to talk in the other direction too: send RTP packets or frames with our voice (not video) to the gateway. :-)
Coordinator
May 13, 2016 at 7:01 PM
Edited May 13, 2016 at 7:03 PM
Great!

I am going to make another update today which should also improve performance further.

Unfortunately I didn't get around to RtpFrame as much as I wanted but I did get quite a few other things done.

Let's start a new thread for anything else, as this one is long.

Let me know what 'issues' are still present and we can go from there with the version I will release today.

Please also don't forget feedback.

P.S. I added some notes in RtpFrame about how it can be very efficient to send packetized data using only a single RtpHeader and the data which already exists from the decoder.

We can work on that when you start sending!
Marked as answer by juliusfriedman on 5/13/2016 at 11:01 AM