Push (RTSP Record) to Wowza

Nov 6, 2014 at 10:13 PM
I would like to use this library to push H.264 frames (NAL units) to a Wowza server for redistribution. Is that possible with this library?
Coordinator
Nov 6, 2014 at 10:50 PM
1) This library can also re-distribute the media, so you probably don't need Wowza on top of it, or any other server or software package.

2) Yes, this library supports Packetization and Depacketization of any type of RTP media; work is underway to support the most popular formats first, e.g. JPEG is complete but requires system decoders until the managed decoders are complete. Playback from a supported digital media container (a Demuxer) is next and will allow you to take any video file and re-distribute it on demand per client with complete FF, RW and Pause support (encoding and decoding are not needed). That release should also allow you to get the stream data from any supported container and either Packetize or Depacketize it yourself.

3) This library will also eventually support encoding and decoding of all video and audio data, but that release is a ways off; when completed it will allow Packetization and Depacketization from Bitmap, etc. Until then you can see how this is planned to be achieved by looking at the RFC6184 class et al.

If you need anything else let me know!
Marked as answer by juliusfriedman on 11/6/2014 at 2:50 PM
Nov 7, 2014 at 1:12 AM
Edited Nov 7, 2014 at 1:12 AM
Thanks for the quick response. Just to be clear, I am talking about using the RTSP RECORD command to initiate a connection to the Wowza server. So I will be the RTSP client and the source of the video stream. I'll need just the RTP packetization and RTSP (not encoding or decoding) to push the H.264 frames. The workflow would be something like:

connect to the Wowza server
announce the stream
send RTSP track info (mode == record) for each track
stream H.264 frames to the server (roughly the packetization piece sketched below)
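For reference, here is a minimal sketch of what that last packetization step can look like for H.264 per RFC 6184, using only the single NAL unit mode (no FU-A fragmentation) and nothing from this library; the payload type (96) and the randomly chosen SSRC are assumptions, and a real sender would also need the RTSP signalling and RTCP:

```
using System;

// Rough sketch only: wraps one H.264 NAL unit per RTP packet (RFC 6184
// "single NAL unit" mode). FU-A fragmentation for large NAL units, RTCP
// and the RTSP signalling are intentionally left out.
class H264Packetizer
{
    const int RtpHeaderLength = 12;
    ushort _sequenceNumber;          // increments once per packet
    readonly uint _ssrc;             // assumption: random per RFC 3550
    readonly byte _payloadType;      // assumption: dynamic payload type 96

    public H264Packetizer(byte payloadType = 96)
    {
        _payloadType = payloadType;
        _ssrc = (uint)new Random().Next();
    }

    // nal starts with its NAL header byte (no 00 00 00 01 start code).
    // timestamp is in 90 kHz units; marker marks the last NAL of a frame.
    public byte[] Packetize(byte[] nal, uint timestamp, bool marker)
    {
        var packet = new byte[RtpHeaderLength + nal.Length];
        packet[0] = 0x80;                                    // V=2, P=0, X=0, CC=0
        packet[1] = (byte)((marker ? 0x80 : 0x00) | _payloadType);
        packet[2] = (byte)(_sequenceNumber >> 8);            // sequence number
        packet[3] = (byte)_sequenceNumber;
        _sequenceNumber++;
        packet[4] = (byte)(timestamp >> 24);                 // RTP timestamp
        packet[5] = (byte)(timestamp >> 16);
        packet[6] = (byte)(timestamp >> 8);
        packet[7] = (byte)timestamp;
        packet[8] = (byte)(_ssrc >> 24);                     // SSRC
        packet[9] = (byte)(_ssrc >> 16);
        packet[10] = (byte)(_ssrc >> 8);
        packet[11] = (byte)_ssrc;
        Buffer.BlockCopy(nal, 0, packet, RtpHeaderLength, nal.Length);
        return packet;
    }
}
```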

thanks,
mark
Coordinator
Nov 7, 2014 at 1:36 AM
1) You can already connect to Wowza and send an ANNOUNCE message.

2) For sending the RTSP track info, I would imagine the ANNOUNCE would cover that since it contains the SDP.

3) Once you announce the stream to Wowza it should automatically start streaming from the source.

In your example it seems like you want to make an H.264 stream from something, then announce that stream to Wowza, at which point Wowza will go to the RTSP/RTP source and consume the H.264 frames.

In this library you would need to make an RFC6184Media class, start the media, and then announce the URL of the media to Wowza, at which point Wowza will connect to the RtspServer and begin to download the frames. (The record mode only tells Wowza to record the data as well as play it back.)

If you need anything else let me know!
Marked as answer by juliusfriedman on 11/6/2014 at 6:00 PM
Nov 7, 2014 at 1:52 AM
That's close, but not how it works. In step #3 I push the frames to Wowza. I have this working using the StreamCoders product, but I need a Mono-friendly implementation. You can refer to libstreaming (in Java) for an implementation; StreamCoders also includes this as sample code in their download.

thanks again,
mark
Coordinator
Nov 7, 2014 at 1:57 AM
Edited Nov 7, 2014 at 2:00 AM
Please check your references.

The sendRequestRecord method in that library sends the RECORD request to Wowza.

The ANNOUNCE is standard; what happens when you announce is up to the server, and in this case Wowza allows a stream to be added for playback or recording with ANNOUNCE. (My RtspServer can as well, with a custom handler.)

You can't PUSH to an RTSP connection because it must be established first. Once established, if using RTSP 2.0 you can push messages to the client from the server, but Wowza doesn't support this right now and neither does StreamCoders (although it is easier to implement with StreamCoders than with Wowza).

When you're pushing the frames, what is really happening is that Wowza connects to your RtspServer and starts to receive the frames, which come from the stream as you create them; as you send them out, Wowza reads them.

It's easier to explain using the RFC2435Media class because it's fully functional at this time.

Make an RFC2435Media and packetize some images.

Announce the URL to Wowza (record can be indicated with the ANNOUNCE).

Wowza connects to the RtspServer / RFC2435Media and starts consuming frames.

Send RECORD to Wowza; Wowza starts recording (unless already indicated in the ANNOUNCE).

This is completely achievable and already implemented; see the Screen demo in Tests.TestRtspServer, which publishes the desktop over RTSP.
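As a rough idea of what "packetize some images" means at the RTP level: per RFC 2435 (RTP/JPEG), each packet carries an 8-byte JPEG main header in front of the scan data. A minimal sketch of building just that header follows (the RTP header itself, quantization-table headers for Q >= 128 and restart-marker headers are omitted; RFC2435Media takes care of the actual packetization for you):

```
using System;

// Sketch of the 8-byte RFC 2435 JPEG main header that precedes the JPEG
// scan data in every RTP/JPEG packet. The RTP header, quantization table
// headers (Q >= 128) and restart marker headers are omitted for brevity.
static class Rfc2435MainHeader
{
    // fragmentOffset: byte offset of this fragment within the JPEG scan data.
    // type: 0 = 4:2:2, 1 = 4:2:0. q: quantization factor. width/height in pixels.
    public static byte[] Build(int fragmentOffset, byte type, byte q, int width, int height)
    {
        byte[] header = new byte[8];
        header[0] = 0;                             // type-specific
        header[1] = (byte)(fragmentOffset >> 16);  // 24-bit fragment offset
        header[2] = (byte)(fragmentOffset >> 8);
        header[3] = (byte)fragmentOffset;
        header[4] = type;
        header[5] = q;
        header[6] = (byte)(width / 8);             // width in units of 8 pixels
        header[7] = (byte)(height / 8);            // height in units of 8 pixels
        return header;
    }
}
```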

Let me know if you need anything else!
Marked as answer by juliusfriedman on 11/6/2014 at 6:00 PM
Nov 7, 2014 at 6:36 AM
Hello. I'm working on a similar problem. I need to stream live audio from the client to the server using RTSP RECORD. The client may be behind NAT or firewall.
Here's how it works in FFmpeg:
C->S (RTSP): ANNOUNCE
S->C (RTSP): 200 OK
C->S (RTSP): SETUP (Transport: ...;mode=record...)
S->C (RTSP): 200 OK (Transport: ...;mode=receive...)
C->S (RTSP): RECORD
S->C (RTSP): 200 OK

C->S (RTP) : RTP frames
....

C->S (RTSP): TEARDOWN
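For anyone who wants to prototype that exchange by hand, here is a minimal, hypothetical C# sketch of the client side using raw RTSP over TCP with interleaved RTP; the host, stream name, SDP values and session id are placeholders, error handling and RTCP are omitted, and a real client must parse each reply (in particular the Session header from SETUP) rather than assume 200 OK:

```
using System;
using System.Net.Sockets;
using System.Text;

// Hypothetical sketch of the push flow above: ANNOUNCE + SETUP(mode=record)
// + RECORD, then RTP framed over the same TCP connection (channel 0).
// Placeholder host/stream/SDP/session; replies are printed but not validated.
class RtspPushSketch
{
    static void Main()
    {
        const string host = "wowza.example.com";                  // placeholder
        const string url = "rtsp://wowza.example.com:554/live/myStream";
        int cseq = 1;

        using (var tcp = new TcpClient(host, 554))
        using (NetworkStream stream = tcp.GetStream())
        {
            string sdp = "v=0\r\n" +
                         "o=- 0 0 IN IP4 127.0.0.1\r\n" +
                         "s=push\r\n" +
                         "c=IN IP4 0.0.0.0\r\n" +
                         "t=0 0\r\n" +
                         "m=video 0 RTP/AVP 96\r\n" +
                         "a=rtpmap:96 H264/90000\r\n" +
                         "a=control:trackID=0\r\n";

            Send(stream, "ANNOUNCE " + url + " RTSP/1.0\r\nCSeq: " + cseq++ +
                         "\r\nContent-Type: application/sdp\r\nContent-Length: " +
                         sdp.Length + "\r\n\r\n" + sdp);

            Send(stream, "SETUP " + url + "/trackID=0 RTSP/1.0\r\nCSeq: " + cseq++ +
                         "\r\nTransport: RTP/AVP/TCP;unicast;interleaved=0-1;mode=record\r\n\r\n");

            // A real client must take the Session value from the SETUP reply.
            Send(stream, "RECORD " + url + " RTSP/1.0\r\nCSeq: " + cseq++ +
                         "\r\nSession: 12345678\r\n\r\n");

            // Interleaved RTP: '$', channel, 2-byte big-endian length, packet bytes.
            byte[] rtpPacket = new byte[12];                       // placeholder packet
            byte[] framed = new byte[4 + rtpPacket.Length];
            framed[0] = 0x24;                                      // '$'
            framed[1] = 0;                                         // channel 0
            framed[2] = (byte)(rtpPacket.Length >> 8);
            framed[3] = (byte)(rtpPacket.Length & 0xFF);
            Buffer.BlockCopy(rtpPacket, 0, framed, 4, rtpPacket.Length);
            stream.Write(framed, 0, framed.Length);
        }
    }

    static void Send(NetworkStream stream, string request)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(request);
        stream.Write(bytes, 0, bytes.Length);
        byte[] reply = new byte[4096];
        int read = stream.Read(reply, 0, reply.Length);            // not validated here
        Console.WriteLine(Encoding.ASCII.GetString(reply, 0, read));
    }
}
```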
Coordinator
Nov 7, 2014 at 6:49 AM
Edited Nov 7, 2014 at 6:54 AM
That looks about right also, probably even more correct than my explanation.

The only thing I would like to point out is that RFC 2326 10.11 (RECORD) only specifies an example for a conference, and that it is similar to GET_PARAMETER in the sense that only certain rules are provided for it, leaving the design up to interpretation by each implementation.

What if the client is not connected over TCP? How are you going to initiate the NAT traversal remotely? Possibly you set up the port beforehand, but the firewall must still see initial outbound traffic before it allows such inbound traffic (which could also be achieved)... but without a proper way to specify the ports it's best to use SETUP.

It makes more sense to handle it as I stated in general, and wherever allowed, certain conditions can allow for derivatives of that logic if and when required.

Just because FFmpeg or another library does or doesn't do something doesn't mean it's always correct.

VLC still doesn't support RTP over TCP à la RFC 4571, and even though one of its development team defined the NUT container format, its own implementation had (and may still have) bugs due to it supporting older versions of the container.

Also, why would you want to push the data to the client? They would very well be able to just get it themselves.

In cases where something like this could be required for a specific API, it can still be achieved as indicated; I'm just not sure my server will follow such leads when implemented.

Let me know if you need anything else.
Marked as answer by juliusfriedman on 11/6/2014 at 10:49 PM
Nov 7, 2014 at 7:44 AM
Edited Nov 7, 2014 at 7:52 AM
ivas' example was spot on, the client being FFmpeg and the server being Wowza. Because the client is behind the firewall, its connection is not (usually) prohibited.

In ivas' example, the "C->S (RTP): RTP frames" part is the 'streaming' of frames I was referring to. I really think ivas' need and mine are the same (he's looking for audio only).
Nov 7, 2014 at 12:40 PM
I'll tag along since I'm looking for the same as Ivas.

juliusfriedman wrote:
Also, why would you want to push the data to the client? They would very well be able to just get it themselves.
In my case I want to stream audio from a mobile radio studio to the primary studio. The "sender" is on a closed router, and I want to start/stop the stream from the "sender".
Coordinator
Nov 7, 2014 at 2:24 PM
Edited Nov 7, 2014 at 2:27 PM
The sender can stop the stream by sending a Goodbye (RTCP BYE) or by not sending data for a specified period; push is still not required for this, and furthermore changes to a stream can use PLAY_NOTIFY in RTSP 2.0.

The RECORD request could potentially accept frames, but I would need to understand whether the standard allows this.

The RECORD request really just tells the server to start recording something; the server can and should retrieve the data itself, for various reasons.

In a situation where such an exchange is required, it can already be done, but again my server may not allow this without a custom handler.
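For reference, the Goodbye mentioned above is just a small RTCP BYE packet (RFC 3550 section 6.6); a rough sketch of building one for a single SSRC is below, where the SSRC is whatever your RTP sender is using and, in practice, the BYE is usually sent compounded with a sender report:

```
using System;

// Rough sketch: build an RTCP BYE (Goodbye) packet for one SSRC (RFC 3550 6.6).
// In practice this is sent compounded with a sender/receiver report.
static class RtcpGoodbye
{
    public static byte[] Build(uint ssrc)
    {
        byte[] bye = new byte[8];
        bye[0] = 0x81;                  // V=2, P=0, source count = 1
        bye[1] = 203;                   // packet type 203 = BYE
        bye[2] = 0;                     // length = 1 (32-bit words minus one)
        bye[3] = 1;
        bye[4] = (byte)(ssrc >> 24);    // the SSRC saying goodbye
        bye[5] = (byte)(ssrc >> 16);
        bye[6] = (byte)(ssrc >> 8);
        bye[7] = (byte)ssrc;
        return bye;
    }
}
```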
Marked as answer by juliusfriedman on 11/7/2014 at 6:24 AM
Nov 7, 2014 at 3:09 PM
I think for ivas and myself the use case would be the client sending the frames because of network traversal. FFmpeg and VLC are two well-known products that support sending data in this way. The libstreaming project (Java, but it's readable to a C# programmer) also has a sample implementation; I am trying to work with that now.
Coordinator
Nov 7, 2014 at 7:42 PM
Edited Nov 7, 2014 at 7:46 PM
VLC and FFmpeg are the same for all intents and purposes; without FFmpeg / libav, VLC wouldn't work.

http://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol

Also See

RFC2326

```
14.6 Recording

The conference participant client C asks the media server M to record
the audio and video portions of a meeting. The client uses the
ANNOUNCE method to provide meta-information about the recorded
session to the server.
 C->M: ANNOUNCE rtsp://server.example.com/meeting RTSP/1.0
       CSeq: 90
       Content-Type: application/sdp
       Content-Length: 121

       v=0
       o=camera1 3080117314 3080118787 IN IP4 195.27.192.36
       s=IETF Meeting, Munich - 1
       i=The thirty-ninth IETF meeting will be held in Munich, Germany
       u=http://www.ietf.org/meetings/Munich.html
       e=IETF Channel 1 <ietf39-mbone@uni-koeln.de>
       p=IETF Channel 1 +49-172-2312 451
       c=IN IP4 224.0.1.11/127
       t=3080271600 3080703600
       a=tool:sdr v2.4a6
       a=type:test
       m=audio 21010 RTP/AVP 5
       c=IN IP4 224.0.1.11/127
       a=ptime:40
       m=video 61010 RTP/AVP 31
       c=IN IP4 224.0.1.12/127

 M->C: RTSP/1.0 200 OK
       CSeq: 90

 C->M: SETUP rtsp://server.example.com/meeting/audiotrack RTSP/1.0
       CSeq: 91
       Transport: RTP/AVP;multicast;destination=224.0.1.11;
                  port=21010-21011;mode=record;ttl=127


 M->C: RTSP/1.0 200 OK
       CSeq: 91
       Session: 50887676
       Transport: RTP/AVP;multicast;destination=224.0.1.11;
                  port=21010-21011;mode=record;ttl=127

 C->M: SETUP rtsp://server.example.com/meeting/videotrack RTSP/1.0
       CSeq: 92
       Session: 50887676
       Transport: RTP/AVP;multicast;destination=224.0.1.12;
                  port=61010-61011;mode=record;ttl=127

 M->C: RTSP/1.0 200 OK
       CSeq: 92
       Transport: RTP/AVP;multicast;destination=224.0.1.12;
                  port=61010-61011;mode=record;ttl=127

 C->M: RECORD rtsp://server.example.com/meeting RTSP/1.0
       CSeq: 93
       Session: 50887676
       Range: clock=19961110T1925-19961110T2015

 M->C: RTSP/1.0 200 OK
       CSeq: 93
```
There is the RECORD example; as you can see, it is just an indication to start recording.

ANNOUNCE is to tell a server about a new stream.

The part up for interpretation is whether the server should accept frames inline with the RECORD request, and the answer is no, not unless it's proprietary to that implementation.

The reasons being:

UDP or TCP must be negotiated in a SETUP request (even in ivas' example). This is required to ensure network traversal works. (UDP, that is, since TCP is already established at that time over RTSP, unless RTSPU is used.)

The person sending the frames might want to bail out after they ANNOUNCE and RECORD; why should they have to stay on the wire pumping frames when the client has the ability to come to the server and get them at its leisure?

The server RECORDing could run out of space during the RECORD.

There is absolutely NO REASON for the CLIENT to send stream data to the SERVER when the purpose of the SERVER is to consume / AGGREGATE RTP / RTSP STREAMS.

If for some reason your scenario requires that, then it can be achieved at the current time and with the current classes, using only the `RtspClient` and `RtpClient`. The only possible scenario where I could see this being partly useful is when a device, e.g. a mobile phone, wants to be both an `RtspServer` and an `RtspClient`; even in such a scenario it would be able to simply ANNOUNCE the stream to the SERVER, so the portion requiring inline frame transfer is still suspect.

Let me know if I can be of any assistance.

I know I won't be adding support for ANNOUNCE or RECORD until the `Muxer` release, at which point I will be more than happy to research and implement this if required, but in the end I think it will be more useful and compliant if implemented in the way I describe, rather than trying to copy another implementation which may have interpreted the standard incorrectly or just aimed to implement something outside of the standard for some reason.
Marked as answer by juliusfriedman on 11/7/2014 at 11:42 AM
Coordinator
Nov 10, 2014 at 7:59 PM
Edited Nov 10, 2014 at 8:02 PM
Guys, I have looked at https://github.com/fyhertz/libstreaming/tree/master/src/net/majorkernelpanic/streaming/rtsp

The RtspServer doesn't even support RECORD.

The RtspClient is able to send a RECORD (just like mine) but there is no pushing of RTP going on.

What, where, how and why exactly does this come into play again?

Are you sure you are not just seeing Wowza coming to pick up the stream, or is this some implementation of pipelining? Since the SETUP request was received, it seems it's just using the existing connection.

If we had a copy of a complete negotiation including the headers it would be easier to say what exactly is going on.
Coordinator
Apr 15, 2015 at 1:09 AM
Edited Apr 15, 2015 at 1:10 AM
Hello guys, some changes are coming, and if you really need push-based RTP and RTSP support, let me know what scenario you would like to support and I will gladly set up a simple sample on the RTSP server.

The RtspClient is going to replace the client session on the server soon, and there will probably also no longer be an RtpClient directly attached to the RtspClient, although you will still be able to get to it if required.

I see one use of this being live feeds from a camera or laptop, feeding multiple end users with either an exact copy or a transcoded copy.

Another use would be video conferencing; each user would push their media to the server, and the server would optionally push it on to the conference, a single user, or otherwise.

Provide some feedback and I will happily support something like this in a standards-compliant manner with examples, but please also keep in mind this is already possible as-is, with some minor changes to enforce the required scenario (e.g. who can submit to a stream and when), and doesn't require RECORD support at all.
Marked as answer by juliusfriedman on 4/14/2015 at 5:09 PM