What about the idea of using shared objects instead of cue points for captions?
The clients would see the captions faster than with cue points, but on the other hand the captions will be lost unless I save them on the server in a separate file. I use different shared objects for the different languages.
Do you think this is feasible?
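For reference, a minimal sketch of what a per-language remote shared object could look like on the viewer side; the object name "captions_de" and the slot name "text" are assumptions, not anything established in this thread:

import flash.events.SyncEvent;
import flash.net.NetConnection;
import flash.net.SharedObject;

// nc is assumed to be an already-connected NetConnection
var captionSO:SharedObject = SharedObject.getRemote("captions_de", nc.uri, false);
captionSO.addEventListener(SyncEvent.SYNC, onCaptionSync);
captionSO.connect(nc);

function onCaptionSync(e:SyncEvent):void {
    // every changed slot is listed in changeList; "text" is the assumed slot name
    for each (var change:Object in e.changeList) {
        if (change.name == "text") {
            trace("caption: " + captionSO.data.text);
        }
    }
}

One shared object per language ("captions_en", "captions_fr", ...) would then just mean picking the object name from the viewer's language setting.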
Once cue points are injected into a live stream, is it possible to edit and change them afterwards? We need to correct the subtitle text if there are errors.
Can you give me some hints on how to do it with external XML files? Maybe a link?
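For what it's worth, the FLVPlayback captioning component reads the W3C Timed Text XML format, but a homegrown file can be simpler. A rough sketch of loading one; the file name and the element/attribute names here are made up for illustration:

// hypothetical caption file, e.g. captions_de.xml:
// <captions>
//   <caption time="12.5">Guten Abend</caption>
//   <caption time="17.0">Willkommen!</caption>
// </captions>

import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.events.Event;

var xmlLoader:URLLoader = new URLLoader();
xmlLoader.addEventListener(Event.COMPLETE, onCaptionsLoaded);
xmlLoader.load(new URLRequest("captions_de.xml"));

function onCaptionsLoaded(e:Event):void {
    var xml:XML = new XML(xmlLoader.data);
    for each (var cap:XML in xml.caption) {
        // display each caption when playback reaches its time attribute
        trace(Number(cap.@time) + "s: " + cap.text());
    }
}

Since the file lives outside the stream, correcting a subtitle after the broadcast is just an edit to the XML.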
I know the plugin, but we can't use it for live broadcasts.
At the moment I have implemented a solution with shared objects, but the problem is that the subtitles run about 20 seconds ahead of the video; the stream is buffering too much.
During a live broadcast we do subtitles in six languages. I rewrote the captions plugin to fit our needs, and I also wrote a tool for our translators.
The tool instantly sends the text to the viewers using shared objects, and at the end of the broadcast an XML file containing the subtitles is saved.
If I switch from shared objects to cue points, is it possible to change the cue points after the broadcast?
Or should I simply suppress the cue points and use the XML file?
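A sketch of how that 20-second lead could be compensated without switching to cue points: have the translator tool stamp each caption with the current stream time, and let the player hold captions until its own NetStream.time catches up. Everything here (the queue, the field names) is an assumption, and the two clocks would still need a common reference, e.g. something like the publishTime handshake discussed further down:

// ns is the playing NetStream; pending holds {time:Number, text:String} objects,
// kept sorted because captions arrive in broadcast order
var pending:Array = [];

// called from the shared object's sync handler with the sender's stream time
function queueCaption(time:Number, text:String):void {
    pending.push({time: time, text: text});
}

// poll this on a Timer or ENTER_FRAME
function showDueCaptions():void {
    while (pending.length > 0 && pending[0].time <= ns.time) {
        trace("caption: " + pending[0].text); // replace with the real display call
        pending.shift();
    }
}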
I don't quite understand how to read the cue points from a live stream.
You wrote:
// set up the NetStream client for callbacks on the live stream
var nsPlayClientObj:Object = new Object();
nsPlayClientObj.setCaption = function(caption:String):void
{
    trace(caption);
}
nsPlay.client = nsPlayClientObj;
Can you be more specific?
I created a new NetStream and added a handler.
ns = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, statusHandler);
But where should I place your code so that it reads the cue points when one arrives?
Sorry! I don't get it!
Where do I need to put that code?
Where is the variable that contains the text of the cue point?
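To answer where the snippet goes: attach the client object to the same NetStream you created, after the NetConnection connects and before play(); the text then arrives as the caption parameter of the callback. A minimal end-to-end sketch; the server URL and stream name are placeholders:

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, ncStatusHandler);
nc.connect("rtmp://example.com/live"); // placeholder URL

function ncStatusHandler(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.addEventListener(NetStatusEvent.NET_STATUS, statusHandler);

        // the client object goes here, before play()
        var nsPlayClientObj:Object = new Object();
        nsPlayClientObj.setCaption = function(caption:String):void {
            // "caption" is the variable that contains the cue point text
            trace(caption);
        };
        ns.client = nsPlayClientObj;

        ns.play("myStream"); // placeholder stream name
    }
}

function statusHandler(e:NetStatusEvent):void {
    trace(e.info.code);
}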
You wrote: This is ActionScript used in the Flash CS authoring tool, or the Flex/Flash Builder IDE.
Of course! What else would it be?
No, I don't want that list.
I want to get better support here, if possible.
This is my Wowza code:
double pTime = System.currentTimeMillis();
AMFDataMixedArray data = new AMFDataMixedArray();
data.put("publishTime", new AMFDataItem(pTime));
stream.sendDirect("setPublishTime", data);
This is my JW Player code:
public function setPublishTime(obj:Object):void {
trace("NetClient-setPublishTime: "+ obj.publishTime );
}
and I set a client on the NetStream: _stream.client = new NetClient(this);
When I use stream.send instead of stream.sendDirect, it works.
I'm lost! Why doesn't sendDirect work? Am I missing something here?
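For reference, a minimal shape of the NetClient described above (an assumption about the class layout, not JW Player's actual source). Any handler name the server sends must exist on the client object, otherwise the NetStream dispatches AsyncErrorEvent.ASYNC_ERROR, and the client has to be assigned before play() starts delivering data:

package {
    public class NetClient {
        private var _owner:Object;

        public function NetClient(owner:Object) {
            _owner = owner;
        }

        // invoked for stream.send("setPublishTime", data) from the Wowza side
        public function setPublishTime(obj:Object):void {
            trace("NetClient-setPublishTime: " + obj.publishTime);
        }
    }
}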
@Richard Lanham
Hi Richard,
This is Muni. I am developing an Android application that live-streams video from the Android client to a Wowza server, where it is played back in an OpenCV-based player. I want to send metadata/custom events to the players along with the stream: when a sensor fires, the player should be notified of the event in sync with the video, at the video time at which the sensor fired. I am currently sending events through wowzaBroadcaster.sendDataEvent(), but the events are delivered asynchronously: they reach the player instantly rather than at the matching point in the video.
My requirement is to send the sensor information in sync with the video, so that the player is notified at the same video time the sensor fired, the same way captioning works.
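As this thread observes above, an event injected into the stream itself (stream.send rather than sendDirect) is timestamped into the stream and reaches the player at the matching video time. Sketched in ActionScript for consistency with the rest of this thread (an OpenCV-based player would need its own equivalent); the handler name onSensorData and the payload field are assumptions:

// ns is the playing NetStream
var sensorClient:Object = new Object();
sensorClient.onSensorData = function(data:Object):void {
    // fires when playback reaches the stream time where the event was injected
    trace("sensor event at video time " + ns.time + ": " + data.value);
};
ns.client = sensorClient;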