Marker packet

Topics: Question
Apr 8, 2016 at 12:21 PM
Hi,

We are trying to get an RTSP stream from an H264 camera. We are able to start the streaming, but soon get an InvalidOperationException in Media.Rtp.RtpFrame, namely "Cannot have more than one marker packet in the same RtpFrame."

In the network stream we can see that the camera does send multiple marker packets directly after one another.

Our questions are:
  • Is it typical behaviour for there to be multiple marker packets?
  • Could this be related to the audio stream that we also receive from the camera?
  • Is there a property on the RtpFrame class that influences this behaviour, either to allow the multiple markers or to fire FrameChanged earlier?
Can you please assist?

Thanks in advance.

https://dl.dropboxusercontent.com/u/62550666/h264_1.png

https://dl.dropboxusercontent.com/u/62550666/h264_2.png
Coordinator
Apr 8, 2016 at 12:50 PM
If you take a look at the RFC here, Section 5.1 (RTP Header Usage) says:

Marker bit (M): 1 bit
  Set for the very last packet of the access unit indicated by the
  RTP timestamp, in line with the normal use of the M bit in video
  formats, to allow an efficient playout buffer handling.  For
  aggregation packets (STAP and MTAP), the marker bit in the RTP
  header MUST be set to the value that the marker bit of the last
  NAL unit of the aggregation packet would have been if it were
  transported in its own RTP packet.  Decoders MAY use this bit as
  an early indication of the last packet of an access unit but MUST
  NOT rely on this property.

     Informative note: Only one M bit is associated with an
     aggregation packet carrying multiple NAL units.  Thus, if a
     gateway has re-packetized an aggregation packet into several
     packets, it cannot reliably set the M bit of those packets.
In your example it's hard to say whether the camera is incorrectly setting the marker bit or whether the timestamp is incorrectly set, because I can't see the individual NAL units. It's also possible that the frames / packets have different PayloadTypes with the same timestamp; I simply can't tell from the picture.

This could be specifically allowed for in the RFC6184 class itself, via additional marker properties and/or by scanning the contained NAL types to verify them. Usually I see the SPS and PPS out of band, with a different PayloadType than that of the main stream (sometimes indicated in the MediaDescription, sometimes not). When the SPS and PPS are in band in the same stream and use the same PayloadType, or when a configuration is shared for multiple streams, then the inclusion of multiple marker packets could be legitimate, but that would more typically be found in RFC6190 streams AFAIK.
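Purely as an illustration (this is not the library's RtpFrame code; the SimpleRtpPacket and SimpleH264Frame types below are made up for the example), a depacketizer could group packets by RTP timestamp and simply tolerate a second marker-bit packet in the same access unit instead of throwing:

// Hypothetical sketch only - not the library's RtpFrame implementation.
// Groups packets by RTP timestamp and tolerates more than one marker bit
// per access unit instead of throwing InvalidOperationException.
using System;
using System.Collections.Generic;

public class SimpleRtpPacket
{
    public uint Timestamp;   // RTP timestamp from the packet header
    public bool Marker;      // M bit from the RTP header
    public byte[] Payload;   // Raw payload (e.g. an H.264 NAL unit or fragment)
}

public class SimpleH264Frame
{
    public uint Timestamp { get; private set; }
    public int MarkerCount { get; private set; }
    private readonly List<SimpleRtpPacket> packets = new List<SimpleRtpPacket>();

    public SimpleH264Frame(uint timestamp) { Timestamp = timestamp; }

    // Returns true when the frame can be raised to the application.
    public bool TryAdd(SimpleRtpPacket packet)
    {
        if (packet.Timestamp != Timestamp)
            throw new InvalidOperationException("Packet belongs to a different access unit.");

        packets.Add(packet);

        // Tolerate extra markers instead of throwing; just count them so the
        // caller can log the condition or inspect the NAL types if required.
        if (packet.Marker) ++MarkerCount;

        // Consider the frame complete on the first marker; later markers with
        // the same timestamp are simply absorbed into the same frame.
        return MarkerCount > 0;
    }
}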

I am pretty sure that on most cameras this is configurable in some aspect of the camera's own configuration; play with some options until you see what affects this output (check the bitrate, profile, packetization mode / SPS, PPS, GOP / keyframe settings).

Let me know if you need anything else and thanks for bringing this up!

If you could leave a small Wireshark capture of the SDP and a few RTP frames that exhibit this behavior, I will do my best to account for it when developing further and, if possible, allow it to be handled better.
Marked as answer by juliusfriedman on 4/8/2016 at 5:50 AM
Apr 8, 2016 at 8:23 PM
Hi Julius,

Thanks for the reply. We have since found that the camera is in fact sending three tracks and by default the RTSP client is connecting to all three. One of them is an ONVIF metadata track, as you can see below.

We will look into how to limit this so we do not connect to the third track, but seeing that this is the default behavior: is the RtpFrame handling these three "streams", and isn't this why we get the multiple marker packet exception?


v=0
o=- 1459500970749629 1 IN IP4 0.0.0.0
s=Session streamed by "nessyMediaServer"
i=h264
t=0 0
a=tool:Streaming Media v2010.04.09
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:Session streamed by "nessyMediaServer"
a=x-qt-text-inf:h264
m=video 0 RTP/AVP 99
c=IN IP4 0.0.0.0
a=rtpmap:99 H264/90000
a=fmtp:99 packetization-mode=1;profile-level-id=4D6028; sprop-parameter-sets=J01gKI1oB4AiflwFsgAAAwACAAADADIeKEVA,KO4ESSA=
a=control:track1
a=cliprect:0,0,1080,1920
a=framerate:25.000000
a=x-bufferdelay:0
m=audio 7878 RTP/AVP 0
a=rtpmap:0 PCMU/8000/1
a=control:track2
m=application 0 RTP/AVP 107
a=rtpmap:107 vnd.onvif.metadata/90000/
a=control:track3

Thanks for the info above.
Coordinator
Apr 11, 2016 at 2:02 PM
Edited Apr 11, 2016 at 2:05 PM
No problem, and thanks for the SDP. If you could export a few frames of the RTP and RTCP it would also help me ensure that this type of scenario is more easily handled in the future.

I probably should allow multiple markers by default in the RtpFrame and if not I should at least log the failed add.

Specifying a certain type of track in the RtspClient is easy; there is already a parameter which allows only certain types of the given tracks to be set up via 'StartPlaying'.
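As a rough illustration only (please check the exact overload in the current source; the parameter order and the Sdp.MediaType values below are assumptions on my part), restricting setup to the video and audio tracks could look something like this:

// Sketch only - verify the StartPlaying overload and the Sdp.MediaType values
// against the version of the library you are using.
using System;

class Example
{
    static void Main()
    {
        using (var client = new Media.Rtsp.RtspClient("rtsp://192.168.0.10/h264"))
        {
            client.Connect();

            // Assumed overload StartPlaying(start, end, mediaTypes): only the
            // video and audio tracks are SETUP/PLAYed, while the application
            // (ONVIF metadata) track is skipped.
            client.StartPlaying(null, null,
                new[] { Media.Sdp.MediaType.video, Media.Sdp.MediaType.audio });

            Console.ReadLine();
        }
    }
}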

I should probably also make it easier to StopPlaying only certain media types.

Just as an FYI, it should also be fine to receive that data stream unless you really don't want it for whatever reason; 'text' and just about any other type of stream is set up by default so long as it is in the SessionDescription. I should also probably allow for similar filtering in RtpClient.FromSessionDescription.

Thanks again for bringing this up!

I will be addressing most of this in the next update.
Marked as answer by juliusfriedman on 4/11/2016 at 7:02 AM
Coordinator
Apr 11, 2016 at 7:12 PM
Changeset 111976 should fix most of these issues. If you can, please still upload a small capture of the multiple marker packets in a frame; it would be helpful for me.

Thanks again for bringing this up!
Marked as answer by juliusfriedman on 4/11/2016 at 12:12 PM
Apr 12, 2016 at 6:40 AM
Hi Julius,

Thanks for your reply. Below is a link to the Wireshark capture. If you'd like to take a deeper look, we can give you remote access to the camera or a TeamViewer session to the development machine.

https://dl.dropboxusercontent.com/u/62550666/Wireshark%20capture%2012apr2016.zip

Thanks,

Oelof
Coordinator
Apr 12, 2016 at 4:16 PM
Awesome! Thank you very very much!

I don't think I will need access to the camera any time soon, as you have provided me with everything I need, but if I do I will let you know!

I will also see about getting more support for RFC2198 Depacketization.

I hope the latest changes will be enough to fix this; if not, I will work with you to get it solved.

Thank you for your help and let me know if I can do anything further to assist you also!
Marked as answer by juliusfriedman on 4/12/2016 at 9:16 AM
Coordinator
Apr 12, 2016 at 10:30 PM
Changeset 111980 shows that RFC2198 can be handled; I will add more support soon.
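For background, RFC 2198 prefixes the payload with one or more 4-byte redundancy block headers (an F bit, a 7-bit block payload type, a 14-bit timestamp offset and a 10-bit block length), terminated by a single 1-byte header with F = 0 that carries the primary payload type. A hypothetical sketch of walking those headers (not the code in the changeset itself) could look like this:

// Hypothetical sketch of reading RFC 2198 redundancy headers; not the
// library's actual depacketizer. Each redundant block header is 4 bytes:
//   F(1) | block PT(7) | timestamp offset(14) | block length(10)
// The header list ends with a 1-byte header (F = 0) holding the primary PT.
using System;
using System.Collections.Generic;

static class Rfc2198
{
    public static List<(int payloadType, int timestampOffset, int length)> ReadHeaders(
        byte[] payload, out int primaryPayloadType, out int headerEnd)
    {
        var blocks = new List<(int, int, int)>();
        headerEnd = 0;

        // Keep reading 4-byte headers while the F bit is set.
        while ((payload[headerEnd] & 0x80) != 0)
        {
            int pt = payload[headerEnd] & 0x7F;
            int tsOffset = (payload[headerEnd + 1] << 6) | (payload[headerEnd + 2] >> 2);
            int length = ((payload[headerEnd + 2] & 0x03) << 8) | payload[headerEnd + 3];
            blocks.Add((pt, tsOffset, length));
            headerEnd += 4;
        }

        // Final 1-byte header: F = 0, remaining 7 bits are the primary payload type.
        primaryPayloadType = payload[headerEnd] & 0x7F;
        headerEnd += 1; // the redundant block data follows, then the primary encoding

        return blocks;
    }
}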

Thanks again for your help with sample captures.

If you need anything else at all please let me know!
Marked as answer by juliusfriedman on 4/12/2016 at 3:30 PM
Apr 13, 2016 at 11:50 AM
Hi Julius,

We found a few null reference exceptions in the latest changeset, in RFC6184 ProcessPacket. You said you are busy with changes, so we will wait for a new release before testing again.

https://dl.dropboxusercontent.com/u/62550666/Payload%20is%20null.png

Thanks for all your trouble!

Oelof
Coordinator
Apr 13, 2016 at 12:17 PM
No trouble and sorry for the null reference exception.

I will see if I can replicate that and address it quickly. I'm not 100% sure, but I think this NRE may be related to the RtspServer only, and only because of the way events are dispatched currently, e.g. if the source is disposed in the middle of sending.

I will take a look and update accordingly. Do you see this same behavior using the RtspClient alone?

Thanks again!
Marked as answer by juliusfriedman on 4/13/2016 at 5:17 AM