overriding ModuleBase streaming

Hello, I’ve been having a hard time solving this, so I decided to post here.

What I’m trying to do is stream H.264+AAC content with the addVideoData/addAudioData functions.

So far I’ve managed to implement an IServerNotify2 class that creates a publisher and streams the data correctly, adding the video and audio frames through the publisher.
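
For context, the publisher side is roughly the following (heavily trimmed; the application/instance/stream names are just placeholders and the frame source is not shown):

import com.wowza.wms.server.*;
import com.wowza.wms.vhost.*;
import com.wowza.wms.stream.publish.*;

public class StreamPublisherListener implements IServerNotify2
{
    public void onServerInit(IServer server)
    {
        IVHost vhost = VHostSingleton.getInstance(VHost.VHOST_DEFAULT);

        // clientless stream, visible to every player that asks for "myStream"
        Publisher publisher = Publisher.createInstance(vhost, "live", "_definst_"); // app/instance names are placeholders
        publisher.publish("myStream", "live");

        // later, for every encoded frame (FLV-tagged h264/aac payloads):
        // publisher.addVideoData(videoBytes, videoBytes.length, timecode);
        // publisher.addAudioData(audioBytes, audioBytes.length, timecode);
    }

    public void onServerConfigLoaded(IServer server) { }
    public void onServerCreate(IServer server) { }
    public void onServerShutdownStart(IServer server) { }
    public void onServerShutdownComplete(IServer server) { }
}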

This works perfectly, but now I’m trying to do the same on a per-client basis, in a ModuleBase class, as I assume I should.

In the play function I manage to get the current stream and add the video/audio data to it, apparently correctly (meaning that if I grab the same stream from another part of the code and use the ‘createSnapshot’ sample function, I get back perfectly healthy frames).

The problem is the stream never gets sent to the actual player.

In (very) short, what I have now in the module is:

public class streamer extends ModuleBase
{
    ...
    public void PublishVideo(IMediaStream ims)
    {
        ByteArrayBuffer vc = new ByteArrayBuffer();
        ... // create/get the video data to send
        ims.setVideoTC(DTS, true);
        ims.setVideoSize(vc.size());
        ims.startVideoPacket();
        ims.addVideoData(vc.getRawData(), 0, vc.size());
    }
    public void play(IClient client, RequestFunction function, AMFDataList params)
    {
        IMediaStream ims = client.getAppInstance().getStreams().getStream(client, function.getSrc());
        ... // set stuff on stream
        while(true)
        {
            // this loop never returns, so the play handler blocks here
            PublishVideo(ims);
        }
    }
}

Now, before anyone asks, the problem isn’t related to the actual video content/headers I pass in: the same functions are used in the publisher, and that works fine. The only difference I see is that in the publisher I have a clientless stream that I get at startup and add frames to through the publisher, whereas here I’m trying to add frames directly into a client stream I get on ‘play’.

Any suggestions will be welcome. 🙂

You can get the IMediaStream in the play command like this:

public void play(IClient client, RequestFunction function, AMFDataList params)
{
    IMediaStream ims = getStream(client, function);
}

And there is a higher-level method for injecting metadata:

https://www.wowza.com/docs/how-to-inject-cue-points-or-metadata
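
The core of it (from memory, so check the article for the exact code) looks something like this, where the stream name is a placeholder for whichever stream you want to inject into:

IMediaStream stream = appInstance.getStreams().getStream("myStream"); // placeholder name
AMFDataMixedArray data = new AMFDataMixedArray();
data.put("text", new AMFDataItem("metadata text goes here"));
stream.sendDirect("onTextData", data);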

Richard

Not sure if I understand, but take a look at the IMediaStreamActionNotify2 onPlay handler:

https://www.wowza.com/docs/imediastreamactionnotify2-example
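
The general shape of that example is a listener attached when the stream is created, inside your module class (trimmed; the remaining IMediaStreamActionNotify2 methods still need empty bodies):

public void onStreamCreate(IMediaStream stream)
{
    stream.addClientListener(new StreamListener());
}

class StreamListener implements IMediaStreamActionNotify2
{
    public void onPlay(IMediaStream stream, String streamName, double playStart, double playLen, int playReset)
    {
        // playback of streamName is about to start on this stream
    }
    // onPublish, onUnPublish, onPause, onPauseRaw, onSeek, onStop, onMetaData omitted here
}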

Also, the cuepoint example is shown in the context of a Flash controller, but you could factor that out.

Richard

I’m really not sure how to help manufacture a stream in that way. It is very simple to remap a stream in the play command:

https://www.wowza.com/docs/how-to-override-play-to-remap-a-stream-name
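
The essence of that one is just replacing the requested name before letting the built-in handler run; roughly this, inside your module (the remapped name is a placeholder):

public void play(IClient client, RequestFunction function, AMFDataList params)
{
    String requestedName = getParamString(params, PARAM1);
    String remappedName = "mp4:sample.mp4"; // whatever should be played instead
    getLogger().info("play: remapping " + requestedName + " to " + remappedName);
    params.set(PARAM1, remappedName);
    this.invokePrevious(client, function, params);
}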

There are things you can do at a low level in Wowza using the Publisher API, but there aren’t any examples like that.

Richard

There are a couple of MediaReader examples on the forum. There is one in the second half of this post:

https://www.wowza.com/docs/how-to-insert-a-pre-roll-or-mid-roll-for-video-on-demand-playback-in-flash-rtmp-client

Richard

I am helping as I can. You said you couldn’t find a MediaReader example, but as I said, I’m not really sure how to manufacture a stream the way you describe; I don’t have experience doing that.

We have a list of consultants you can talk to; some of them probably have this kind of experience. If you want that list, send a request to sales@wowza.com. Include a reference to this thread.

Richard

Nitin,

Take a look at this method:

https://www.wowza.com/docs/how-to-loop-a-pre-roll-until-a-live-stream-starts-loopuntillive

Richard

Yes, in the streamschedule.smil playlist, set repeat="false":

<smil>
    <head>
    </head>
    <body>
        <stream name="Stream1"></stream>
        
        <playlist name="pl1" playOnStream="Stream1" repeat="false" scheduled="2009-12-11 16:00:00">
            <video src="mp4:sample.mp4" start="0" length="20"/>
        </playlist>
    </body>
</smil>

Richard

Thank you for your help, Richard. Exelle and I are working together; I will contact sales regarding this issue.

From what I understand, the cuepoint example inserts data on the client’s request. I’m trying to override the normal streaming, i.e. send my own frames in the stream instead of the requested content.

What I wrote above adds the frames to the stream, but it blocks the rest of the pipeline until it is done, so the player (LivePlayer, PlaylistPlayer, etc.) associated with the stream never sends the actual frames to the client. By comparison, in a normal module ‘play’ gets called, does some work, and finishes, and afterwards a player in the background handles sending the content to the client. I think I need either to implement a callback that gets called just before the frames are sent to the client and override it there, or to tell the stream to use my own receiver instead of, for example, the LiveReceiver class.

I tried the onPlay handler in IMediaStreamActionNotify2; same thing, it blocks the stream until I finish.

About the cuepoint example: how can I do it without having the player ask me for a packet every time?

Reiterating what I’ve been trying to say: how can I simply add my own video/audio frames to the stream from a ModuleBase class instead of letting it play whatever content the client asked for?

Simplest example: normally, a client (let’s assume for simplicity’s sake that the live.html example page is used) connects to my server asking for the ‘test.mp4’ stream. Normally, the Wowza server starts sending the requested content to the client and, in the meantime, calls my module’s functions on big events (onAppStart, onConnect, play, etc.) so I can gather statistics and other stuff.

What I want to do is have the client connect exactly as he usually does, but instead of letting the Wowza server give him the requested content, I want to take over and send the video/audio content myself once my module gets called on play (using something simple like addVideoData/addAudioData on the client’s stream).

After going through most of the Wowza API, I managed to find two possible solutions to my problem, but I need to know whether there are any problems or side effects with either of them:

#1. I managed to add my own video/audio frames on a per client basis like so:

  • I have a module containing the ‘play’ function

  • when ‘play’ gets called in my module, I change the requested stream name, just like in the remap stream example above.

  • I then publish a new stream with the changed stream name

  • I sleep until the stream actually gets published

  • and then I continue with invokePrevious from the ‘play’ function

Basically, the following minimal code:

public void play(IClient client, RequestFunction function, AMFDataList params)
{
    params.set(3, newRandomName);

    Thread t = new Thread(new FramePublisher(newRandomName));
    t.start();
    // the frame publisher creates a new publisher with the given stream name and
    // starts adding video and audio frames

    while (newRandomName not published yet)   // pseudocode: wait until the new stream shows up
        Thread.sleep(10);

    invokePrevious(client, function, params);
}

This works in my tests, but what worries me is whether it’s an accepted approach, whether it has side effects I don’t see at the moment, and whether the API’s behavior will change with Wowza upgrades and patches, since it’s not a documented, publicly accepted use of the Publisher API.
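
(In case it helps anyone reading later, the FramePublisher runnable is roughly the following; the application/instance names, the frame source and the timecode handling are placeholders, not the real code.)

class FramePublisher implements Runnable
{
    private final String streamName;

    FramePublisher(String streamName) { this.streamName = streamName; }

    public void run()
    {
        IVHost vhost = VHostSingleton.getInstance(VHost.VHOST_DEFAULT);
        Publisher publisher = Publisher.createInstance(vhost, "live", "_definst_"); // app/instance names are placeholders
        publisher.publish(streamName, "live");

        long timecode = 0;
        while (moreFramesAvailable())               // hypothetical: however your source signals end-of-stream
        {
            byte[] video = nextVideoFrame();        // hypothetical: FLV-tagged h264 frame
            publisher.addVideoData(video, video.length, timecode);

            byte[] audio = nextAudioFrame();        // hypothetical: FLV-tagged aac frame
            publisher.addAudioData(audio, audio.length, timecode);

            timecode += 40;                         // placeholder: use real frame timestamps
        }

        publisher.unpublish();
        publisher.close();
    }
}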

#2. The other solution I could find is to implement my own MediaReader class, add it to the ‘MediaReaders.xml’ file, and have my reader class take care of adding my own frames instead of reading them from a file, like MediaReaderH264 does. The problem here is that I don’t have any examples for the MediaReader like I had for the Publisher API.

What I need to know is whether I can rely on the Publisher API to work as I described above (a publisher created for each connected client, and stopped at disconnect), or whether I should write my own MediaReader (for which I would really need an example from you guys).

I saw the mid-roll example, and it’s not helping. I don’t want to send metadata; I want to send the actual video and audio frames. For example, the Publisher API specified exactly what header bytes it wanted for video/audio frames.

And you completely overlooked the other part of my post above, regarding the usage of the Publisher API: the examples I’ve seen create a publisher at server start-up in an IServerNotify class, not at client play time in a ModuleBase class. I was asking whether it’s intended behavior to allow me to create a publisher for each and every client that comes along.

Hi Excelle,

I am having a similar problem to the one you had several months ago, of adding a few video frames before a live stream. I see that you got it working using the Publisher API. I tried doing it the way you did, but I am not sure what you used for the Publisher API. Is it the Publisher class (Wowza server-side API doc, page 1289) you are referring to? Can you give any hints as to what you used within FramePublisher above?

I am badly stuck and this would be really helpful. Thanks in advance.

Rgds

Nitin

Hi Richard,

Thanks for the quick reply. I checked out the link. Is it possible to stream out sample.mp4 only once instead of looping it?

Nitin