Increase LIVE stream delay

Hello,

For clients to receive smooth, continuous streaming, we want to add a server-side (buffer) delay to the outgoing RTSP stream.

Furthermore, if the encoder disconnects, clients should not be informed; they should keep trying to connect to the stream as usual instead of being disconnected.

Please advise what to tweak.

-Mamoor

Not sure what you mean by this. Please explain in much more detail. We do hold 8 seconds' worth of video server-side in a buffer, but I am not sure what specifically you are asking for here related to RTSP.

Charlie

No, this will not increase the delay.

Charlie

The 8 second internal buffer is not configurable.

Richard

Try this package:

https://www.wowza.com/downloads/forums/publishwithdelay/PublishWithDelay.zip

It is old, but Wowza 3 customers have used it successfully for short delays.

Richard

Sorry for the late reply. I think it would work to simply inject into the source stream; the cue points should then appear in the delayed stream. If you have Flex Builder, here is a quick Flash front-end for injecting captions into a live stream.

MXML

<?xml version="1.0" encoding="utf-8"?>
<local:Player width="504" height="470"
			  xmlns:mx="http://www.adobe.com/2006/mxml" 
			  xmlns:local="*"
			  layout="absolute"
			  backgroundColor="#FFFFFF"
			  backgroundAlpha="0" xmlns:mx1="library://ns.adobe.com/flex/mx">
	<mx:UIComponent id="videoContainer" x="2"/>
	<mx:Text id="prompt" x="320" y="429" width="136"  height="27"/>
	<mx:Canvas x="6" y="5" width="498" height="455">
		<mx:Form x="11" y="208" width="487">
			<mx:FormItem label="Server:" labelStyleName="bold">
				<mx:TextInput id="connectStr" width="263" paddingLeft="0" text=""/>
			</mx:FormItem>
			<mx:FormItem label="Stream:" labelStyleName="bold" direction="horizontal">
				<mx:TextInput id="streamStr" width="205" text=""/>
				<mx:Button label="Play" id="connectButton"/>
			</mx:FormItem>
			
			<mx:FormItem label="Caption" labelStyleName="bold" direction="horizontal">
				<mx:TextInput id="captionLanguage" width="26" text="en"/>
				<mx:TextInput id="captionTrackID" width="26" text="1"/>
				<mx:TextInput id="captionText" width="205" text=""/>
				<mx:Button label="Inject" id="captionButton"/>
			</mx:FormItem>
		</mx:Form>
		<mx:Image id="logo" source="assets/logo.png" x="12" y="389"/>
		<mx1:Text id="captionOutput" x="187" y="336" width="300" height="32" text="captionOutput"/>
		<mx:Button id="doFullscreen"  visible="{isConnected}" icon="@Embed(source='assets/fullscreen.png')" width="30" x="360" y="2"/>
		<mx:HSlider id="volumeLevel" value=".5" maximum="1"
					enabled="{isConnected}"
					labels="volume"
					labelOffset="2"  x="10" y="326"/>
		<mx:Text id="playerVersion" x="259.5" y="403" width="223.5"/>
		<mx:Text id="fpsText" x="302" y="378" width="167.5"/>
	</mx:Canvas>
	<mx:Style>
		.alert {color:"0xFF00FF"}
		.bold {font-weight:bold;padding-right:0}
	</mx:Style>
</local:Player>

Actionscript

package
{
	import com.wowza.encryptionAS3.TEA;
	
	import flash.events.*;
	import flash.external.ExternalInterface;
	import flash.geom.*;
	import flash.media.SoundTransform;
	import flash.media.Video;
	import flash.net.NetConnection;
	import flash.net.NetStream;
	import flash.net.Responder;
	import flash.system.Capabilities;
	import flash.system.Security;
	import flash.utils.clearInterval;
	import flash.utils.setInterval;
	import flash.utils.setTimeout;
	
	import mx.controls.Button;
	import mx.controls.HSlider;
	import mx.controls.Image;
	import mx.controls.Text;
	import mx.controls.TextInput;
	import mx.core.Application;
	import mx.core.UIComponent;
	import mx.events.FlexEvent;
	import mx.events.SliderEvent;
	
	public class Player extends Application
	{	
		Security.LOCAL_TRUSTED;
		
		public var movieName:String = "Stream1";
		public var serverName:String = "rtmp://localhost/live";
		private var sharedSecret:String = "test";
		private var nc:NetConnection = null;
		[Bindable]
		public var isConnected:Boolean = false;
		private var nsPlay:NetStream = null;
		private var duration:Number = 0;
		private var progressTimer:Number = 0;
		private var isPlaying:Boolean = false;	
		private var videoObj:Video;
		private var isProgressUpdate:Boolean = false;
		private var fullscreenCapable:Boolean = false;
		private var hardwareScaleCapable:Boolean = false;
		public var doRewind:Button;
		public var doFastRev:Button;
		public var doFastFwd:Button;
		public var doPlay:Button;
		public var connectButton:Button;
		public var captionButton:Button;
		public var captionText:TextInput;
		public var captionName:TextInput;
		public var captionLanguage:TextInput;
		public var captionTrackID:TextInput;
		public var captionOutput:Text;
		public var doSlowMotion:Button;
		public var doFullscreen:Button;
		public var slider:HSlider;
		private var t:SoundTransform = new SoundTransform();
		private var params:Object = new Object();
		public var volumeLevel:HSlider;
		public var isScrubbing:Boolean;
		public var videoContainer:UIComponent;
		private var loc:String;
		public var connectStr:TextInput;
		public var streamStr:TextInput;
		public var logo:Image;
		public var fpsText:Text;
		public var playerVersion:Text;
		public var prompt:Text;
		private var saveVideoObjX:Number;
		private var saveVideoObjY:Number;
		private var saveVideoObjW:Number;
		private var saveVideoObjH:Number;
		private var saveStageW:Number;
		private var saveStageH:Number;
		private var adjVideoObjW:Number;
		private var adjVideoObjH:Number;
		private var videoSizeTimer:Number = 0;
		private var videoLastW:Number = 0;
		private var videoLastH:Number = 0;
		private var debugInterval:Number = 0;
		private var bufferTime:Number = 3;
		private var nsPlayClientObj:Object = new Object();
		
		public function Player()
		{
			addEventListener(FlexEvent.APPLICATION_COMPLETE,mainInit);
			addEventListener(FullScreenEvent.FULL_SCREEN, enterLeaveFullscreen );
		}
		
		private function mainInit(event:FlexEvent):void
		{	
			stage.align="TL";
			stage.scaleMode="noScale";
			
			// get movie name from parameter if defined
			if (loaderInfo.parameters.zmovieName != undefined)
				movieName = loaderInfo.parameters.zmovieName;
			
			if (loaderInfo.parameters.zserverName != undefined)
				serverName = loaderInfo.parameters.zserverName;
			
			videoObj = new Video();
			videoContainer.addChild(videoObj);
			videoObj.width = 400;
			videoObj.height = 300;
			saveVideoObjX = videoObj.x;
			saveVideoObjY = videoObj.y;
			adjVideoObjW = (saveVideoObjW = videoObj.width);
			adjVideoObjH = (saveVideoObjH = videoObj.height);
			
			volumeLevel.addEventListener(SliderEvent.CHANGE,adjustVolume);
			doFullscreen.addEventListener(MouseEvent.CLICK,enterFullscreen);
			
			streamStr.text = movieName;
			connectStr.text = serverName;
			connectButton.addEventListener(MouseEvent.CLICK,connectLivePlayer);
			
			captionButton.addEventListener(MouseEvent.CLICK, injectCaption);
			
			fullscreenCapable = testVersion(9, 0, 28, 0);
			hardwareScaleCapable = testVersion(9, 0, 60, 0);
			
			if (ExternalInterface.available && Application.application.url.search( /http*:/ ) == 0)
			{
				loc = ExternalInterface.call("function(){return window.location.href;}");
				trace("This player served from: " + loc); // you can do client-side hotlink denial here
			}
			
			var h264Capable:Boolean = testVersion(9, 0, 115, 0);
			playerVersion.text = (h264Capable?"H.264 Ready (":"No H.264 (")+Capabilities.version+")";
			
			if (!h264Capable)
				playerVersion.styleName="alert";				
		}
		
		private function injectCaption(event:MouseEvent):void
		{
			if (nc.connected)
			{
				nc.call("setCaption", null, streamStr.text, captionText.text, captionLanguage.text, captionTrackID.text);
			}
			
		}
		
		private function ncOnStatus(infoObject:NetStatusEvent):void
		{
			trace("nc.onStatus: "+infoObject.info.code+" ("+infoObject.info.description+")");
			for (var prop:String in infoObject.info)
			{
				trace("\t"+prop+":\t"+infoObject.info[prop]);
			}
			
			// once we are connected to the server create the nsPlay NetStream object
			if (infoObject.info.code == "NetConnection.Connect.Success")
			{
				if (infoObject.info.secureToken != undefined) //<--- SecureToken change here - respond with decoded secureToken
				{
					var secureResult:Object = new Object();
					secureResult.onResult = function(isSuccessful:Boolean):void
					{
						trace("secureTokenResponse: "+isSuccessful);
					};
					nc.call("secureTokenResponse", new Responder(secureResult.onResult), TEA.decrypt(infoObject.info.secureToken, sharedSecret));
				}
				
				isConnected = true;
				playLiveStream();
				videoLastW = 0;
				videoLastH = 0;
				videoSizeTimer = setInterval(updateVideoSize, 500);
			}
			else if (infoObject.info.code == "NetConnection.Connect.Failed")
				prompt.text = "Connection failed: Try rtmp://[server-ip-address]/simplevideostreaming";
			else if (infoObject.info.code == "NetConnection.Connect.Rejected")
				if(infoObject.info.ex) 
					if (infoObject.info.ex.code == 302) {
						setTimeout(function():void{
							trace("Redirect to: " + arguments[0]);
							nc.connect(arguments[0]);
						},100,infoObject.info.ex.redirect);	
					}
					else
					{
						prompt.text = infoObject.info.description;
					}
		}
		
		private function connectLivePlayer(event:MouseEvent):void
		{
			if (nc == null)
			{
				//enablePlayControls(true);
				nc = new NetConnection();
				nc.addEventListener(NetStatusEvent.NET_STATUS, ncOnStatus);
				nc.connect(connectStr.text);
				
				// uncomment this to monitor frame rate and buffer length
				// debugInterval = setInterval(updateStreamValues, 500);
				
				connectButton.label = "Stop";
			}
			else
			{
				videoObj.attachNetStream(null);
				videoObj.clear();
				videoObj.visible = false;
				duration = 0;
				
				nc.close();
				nc = null;
				
				if (debugInterval > 0)
					clearInterval(debugInterval);
				debugInterval = 0;
				
				connectButton.label = "Play";
				prompt.text = "";
				isConnected = false;
			}
		}	
		
		// function to monitor the frame rate and buffer length
		private function updateStreamValues():void
		{
			var newVal:String = "";
			if (nsPlay != null)
				newVal = (Math.round(nsPlay.currentFPS*1000)/1000)+" fps/"+(Math.round(nsPlay.bufferLength*1000)/1000)+" secs";
			fpsText.text = newVal;
		}
		
		private function nsOnStatus(infoObject:NetStatusEvent):void
		{
			trace("onStatus: ");
			for (var propName:String in infoObject.info)
			{
				trace("  "+propName + " = " + infoObject.info[propName]);
			}
			
			if (infoObject.info.code == "NetStream.Play.Start")
				isProgressUpdate = true;
			else if (infoObject.info.code == "NetStream.Play.StreamNotFound" || infoObject.info.code == "NetStream.Play.Failed")
				prompt.text = infoObject.info.description;
		}
		
		// create the nsPlay NetStream object
		private function playLiveStream():void
		{
			nsPlay = new NetStream(nc);
			nsPlay.addEventListener(NetStatusEvent.NET_STATUS, nsOnStatus);
			
			
			nsPlay.client = nsPlayClientObj;
			
			nsPlayClientObj.onTextData = function(obj:Object):void
			{
				captionOutput.text = obj.text;
			}
			
			nsPlayClientObj.onMetaData = function(infoObject:Object):void
			{
				trace("onMetaData");
				
				// print debug information about the metaData
				for (var propName:String in infoObject)
				{
					trace("  "+propName + " = " + infoObject[propName]);
				}
			};	
			// print debug information when we play status changes
			nsPlayClientObj.onPlayStatus = function(infoObject:Object):void
			{
				trace("onPlayStatus");
				for (var prop:String in infoObject)
				{
					trace("\t"+prop+":\t"+infoObject[prop]);
				}
			};
			//SET BUFFERTIME TO 0 for injectionPurposes
			// set the buffer time and attach the video and audio
			nsPlay.bufferTime = 0;
			
			// subscribe to the named stream
			nsPlay.play(streamStr.text);	
			
			videoObj.attachNetStream(nsPlay);
		}
		
		
		private function updateVideoSize():void
		{
			trace("updateVideoSize: "+stage["displayState"]);
			
			// when we finally get a valid video width/height resize the video frame to make it proportional
			if (videoObj.videoWidth != videoLastW || videoObj.videoHeight != videoLastH)
			{
				videoLastW = videoObj.videoWidth;
				videoLastH = videoObj.videoHeight;
				
				var videoAspectRatio:Number = videoLastW/videoLastH;
				var frameAspectRatio:Number = saveVideoObjW/saveVideoObjH;
				
				adjVideoObjW = saveVideoObjW;
				adjVideoObjH = saveVideoObjH;
				if (videoAspectRatio > frameAspectRatio)
					adjVideoObjH = saveVideoObjW/videoAspectRatio;
				else
					adjVideoObjW = saveVideoObjH*videoAspectRatio;
				
				videoObj.width = adjVideoObjW;
				videoObj.height = adjVideoObjH;
				videoContainer.width = videoObj.width;
				videoContainer.height = videoObj.height;
				videoObj.visible = true;
			}
			else
				clearInterval(videoSizeTimer);
		}
		
		// show/hide the controls when we enter/leave fullscreen
		private function hideAllControls(doHide:Boolean):void
		{
			fpsText.visible = !doHide;
			logo.visible = !doHide;
			connectButton.visible = !doHide;
			doFullscreen.visible = !doHide;
			slider.visible = !doHide;
			playerVersion.visible = !doHide;
		}
		
		private function enterLeaveFullscreen(fsEvent:FullScreenEvent):void
		{
			trace("enterLeaveFullscreen: "+fsEvent.fullScreen);
			
			hideAllControls(fsEvent.fullScreen);
			if (!fsEvent.fullScreen)
			{
				// reset back to original state
				stage.scaleMode = "noScale";
				trace("adjVideoObjW 1: " + adjVideoObjW);
				trace("adjVideoObjH 1: " + adjVideoObjH);				
				videoObj.width = adjVideoObjW;
				videoObj.height = adjVideoObjH;
				videoObj.y = saveVideoObjY + saveVideoObjH - adjVideoObjH;
				videoObj.x = (saveStageW - adjVideoObjW)/2;
			}
		}
		
		private function enterFullscreen(event:MouseEvent):void
		{
			trace("enterFullscreen: "+hardwareScaleCapable);
			if (hardwareScaleCapable)
			{
				// best settings for hardware scaling
				videoObj.smoothing = false;
				videoObj.deblocking = 0;
				
				// grab the portion of the stage that is just the video frame
				stage["fullScreenSourceRect"] = new Rectangle(
					videoObj.x, videoObj.y, 
					videoObj.width, videoObj.height);
			}
			else
			{
				stage.scaleMode = "noBorder";
				
				var videoAspectRatio:Number = videoObj.width/videoObj.height;
				var stageAspectRatio:Number = saveStageW/saveStageH;
				var screenAspectRatio:Number = Capabilities.screenResolutionX/Capabilities.screenResolutionY;
				
				// calculate the width and height of the scaled stage
				var stageObjW:Number = saveStageW;
				var stageObjH:Number = saveStageH;
				if (stageAspectRatio > screenAspectRatio)
					stageObjW = saveStageH*screenAspectRatio;
				else
					stageObjH = saveStageW/screenAspectRatio;
				
				// calculate the width and height of the video frame scaled against the new stage size
				var fsVideoObjW:Number = stageObjW;
				var fsVideoObjH:Number = stageObjH;
				if (videoAspectRatio > screenAspectRatio)
					fsVideoObjH = stageObjW/videoAspectRatio;
				else
					fsVideoObjW = stageObjH*videoAspectRatio;
				
				// scale the video object
				videoObj.width = fsVideoObjW;
				videoObj.height = fsVideoObjH;
				videoObj.x = (stageObjW-fsVideoObjW)/2.0;
				videoObj.y = (stageObjH-fsVideoObjH)/2.0;
			}
			stage["displayState"] = "fullScreen";	
		}
		
		private function playStream():void
		{
			var timecode:Number = nsPlay.time;
			isProgressUpdate = false;
			
			if (!isPlaying)
				nsPlay.resume();
			nsPlay.seek(timecode);
			isPlaying = true;	
		}
		
		private function doPlayToggle(event:MouseEvent):void
		{			
			if (!isPlaying)
			{
				playStream();
				doPlay.label = "pause";
			}
			else
			{
				doPlay.label = "play";
				isProgressUpdate = false;
				isPlaying = false;
				nsPlay.pause();
			}
		}
		
		public function streamRewind(event:Event):void
		{
			if (nsPlay==null) return;
			slider.value=0;
			nsPlay.seek(0);
		}
		
		public function adjustVolume(event:SliderEvent):void
		{
			if (nsPlay==null) return;
			
			var vol:Number;
			
			if (event==null)
			{
				vol = volumeLevel.value;
			} else {
				vol = event.value;
			}
			
			t.volume = vol;
			try{	
				nsPlay.soundTransform = t;
			}
			catch(e:Error)
			{
				trace(e.message);
			}
		}
		
		public function movieSeek(event:Event):void
		{
			if (nsPlay == null) return;
			
			if (doPlay.styleName=="pauseButton")
			{
				nsPlay.resume();
			}			
			nsPlay.seek(slider.value);			
		}
		
		private function testVersion(v0:Number, v1:Number, v2:Number, v3:Number):Boolean
		{
			var version:String = Capabilities.version;
			var index:Number = version.indexOf(" ");
			version = version.substr(index+1);
			var verParts:Array = version.split(",");
			
			var i:Number;
			
			var ret:Boolean = true;
			while(true)
			{
				if (Number(verParts[0]) < v0)
				{
					ret = false;
					break;
				}
				else if (Number(verParts[0]) > v0)
					break;
				
				if (Number(verParts[1]) < v1)
				{
					ret = false;
					break;
				}
				else if (Number(verParts[1]) > v1)
					break;
				
				if (Number(verParts[2]) < v2)
				{
					ret = false;
					break;
				}
				else if (Number(verParts[2]) > v2)
					break;
				
				if (Number(verParts[3]) < v3)
				{
					ret = false;
					break;
				}
				break;
			}
			trace("testVersion: "+Capabilities.version+">="+v0+","+v1+","+v2+","+v3+": "+ret);	
			return ret;
		}			
	}
}

Richard

The only one of the three that is possible with a default Wowza install is the middle one (low latency).

For low-latency streaming, see here:

https://www.wowza.com/docs/how-to-set-up-low-latency-applications-in-wowza-streaming-engine-for-rtmp-streaming

For the others, custom modules would be required; fairly simple, but time would be needed to build them.
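As a minimal sketch (assuming Wowza Streaming Engine's Application.xml layout, per the linked article), the low-latency case is mostly a stream-type setting on the live application:

```xml
<!-- Application.xml fragment (sketch; see the linked low-latency article for full details) -->
<Streams>
	<StreamType>live-lowlatency</StreamType>
	<StorageDir>${com.wowza.wms.context.VHostConfigHome}/content</StorageDir>
</Streams>
```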

Shamrock

Richard, thank you for posting this module. I had tried accomplishing something similar in the past as an experiment, and this source code shows a nice technique. Would you be able to tell me the best way to set additional data (AMF data such as onTextData) on the delayed stream? Essentially, I want to create a viewer with the non-delayed stream that has a live caption injector (Flash front-end). I’ve added a caption queue (ArrayList) to the PublishWithDelayWorker class, and have AMFData and adjusted timecode values based on the source stream stored there, but I don’t know how to match that source timecode value to a timecode value in the while() loop creating the packets for the delayed stream. Any pointers would be appreciated.
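The timecode matching being asked about can be sketched generically, outside the Wowza API (all class and field names here are hypothetical, not from PublishWithDelay): queue both packets and captions with their source timecodes, and on each pass of the delayed playout loop emit every caption whose source timecode is at or before the packet being released.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;

// Generic time-shift buffer sketch (no Wowza API; names are illustrative).
// Packets and captions are queued with their source-stream timecodes; the
// playout loop releases whatever is due at (now - delay), interleaving each
// caption just before the first packet at or after its timecode.
class DelayBuffer {
	record Packet(long timecode, String payload) {}
	record Caption(long timecode, String text) {}

	private final ArrayDeque<Packet> packets = new ArrayDeque<>();
	private final ArrayDeque<Caption> captions = new ArrayDeque<>();
	private final long delayMillis;

	DelayBuffer(long delayMillis) { this.delayMillis = delayMillis; }

	void addPacket(long timecode, String payload) {
		packets.add(new Packet(timecode, payload));
	}

	void addCaption(long timecode, String text) {
		captions.add(new Caption(timecode, text));
	}

	// Called from the playout loop with "now" on the source clock.
	// Returns everything due for the delayed stream, captions interleaved.
	List<String> drain(long nowMillis) {
		long playoutTime = nowMillis - delayMillis;
		List<String> out = new ArrayList<>();
		while (!packets.isEmpty() && packets.peek().timecode() <= playoutTime) {
			Packet p = packets.poll();
			while (!captions.isEmpty() && captions.peek().timecode() <= p.timecode()) {
				out.add("caption:" + captions.poll().text());
			}
			out.add("packet:" + p.payload());
		}
		return out;
	}
}
```

The key point for the PublishWithDelayWorker question is that no timecode translation is needed: the caption keeps its source timecode, and it is matched against the source timecodes of the packets as they are dequeued for the delayed stream.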

Hello Charlie,

We want to add a delay of more than 8 seconds.

  • For some streams we want a 1-minute hold

  • For some streams we want very low latency (as low as 1 second)

  • For some streams we want a 1-hour hold

Where can I set those values?

-Mamoor

Can anyone help me with the development of the 1st and 3rd options?

-Mamoor

Can we start a delayed stream over RTSP with nDVR?

And how much delay can I set in nDVR?

Are there any known problems?

-Mamoor

We need to stream over RTSP.

Charlie said that 8 seconds of video is held on the server side.

Can we increase this value from 8 seconds to 60 seconds?

If not, by how much can we change it?

-Mamoor

What about HLS: can we increase the server-side delay for HTTP Live Streaming?

We need a 1-minute buffer on the server side.

What if we increase the values of ReceiveBufferSize and SendBufferSize to something much higher?

Will that increase the delay in the stream and give me the desired 1-minute buffer mechanism on the server?

-Mamoor

Hi

You could try setting up a system that records the live stream and plays it back in a loop that appears to be live.

This content would be VOD and can be 1 hour behind the live stream; if each segment is 1 hour long and placed in a playlist, this should handle the 1-hour delay.

The 1-minute delay can be done with the nDVR AddOn, described here:

https://www.wowza.com/docs/how-to-set-up-and-run-wowza-ndvr-for-live-streaming
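A rough Application.xml fragment for enabling nDVR on a live application (element names as in the nDVR documentation; the values shown are illustrative assumptions, not defaults):

```xml
<!-- Add the DVR packetizer alongside the existing live packetizers -->
<LiveStreamPacketizers>cupertinostreamingpacketizer, smoothstreamingpacketizer, sanjosestreamingpacketizer, dvrstreamingpacketizer</LiveStreamPacketizers>
<DVR>
	<!-- Illustrative: keep a 1-hour DVR window on disk (seconds) -->
	<WindowDuration>3600</WindowDuration>
	<StorageDirectory>${com.wowza.wms.context.VHostConfigHome}/dvr</StorageDirectory>
	<ArchiveStrategy>append</ArchiveStrategy>
</DVR>
```

Players then request the DVR stream with a time offset to get the delayed view.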

You can ask for the independent consulting list by emailing support@wowza.com.

Jason

Hi

You can only play back nDVR over HTTP; the list is in the article I showed you.

Playback of recorded streams via HTTP Streaming:

Cupertino (iOS)

Smooth Streaming (Silverlight)

San Jose Streaming (Adobe Flash)

Multi-bitrate stream support (currently assumes streams are time aligned)

It sounds like you can have as much recording time as you can handle with your free disk space.

Read the questions and answers on the bottom of this page,

https://www.wowza.com/docs/how-to-set-up-and-run-wowza-ndvr-for-live-streaming

This may answer your question about any known problems; if you consider them to be problems, that is.

Jason

Hi

No, I’m afraid it’s not.

Jason

Hello,

I want to add a one-hour delay to a live stream.

Can you please help me with that?

Thanks

Dinesh

Hi Dinesh,

To create a live stream with a 1-hour delay, you can record the original stream, ensuring that you segment the recording; I recommend splitting the video every 30 minutes. Using the recorded content, you can then create a live stream with the Stream Publisher module, and this scheduled live stream would simply be scheduled 1 hour behind the start time of the original live stream, creating the 1-hour delay you require.

Note: The schedule will need to be updated with each of the new 30 minute recordings for the scheduled live stream to continue seamlessly. The recordings can’t be 60 minutes long as they wouldn’t be ready/finished in time before they’re needed in the schedule.
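Such a schedule might look like the following streamschedule.smil sketch for the Stream Publisher module (stream name, file names, and dates are hypothetical); each 30-minute recording is scheduled exactly 1 hour after it was captured:

```xml
<smil>
	<head></head>
	<body>
		<stream name="delayedStream"></stream>
		<!-- segment recorded 10:00-10:30, replayed at 11:00 (length in seconds) -->
		<playlist name="seg1" playOnStream="delayedStream" repeat="false" scheduled="2013-01-01 11:00:00">
			<video src="mp4:myStream_10-00.mp4" start="0" length="1800"/>
		</playlist>
		<!-- segment recorded 10:30-11:00, replayed at 11:30 -->
		<playlist name="seg2" playOnStream="delayedStream" repeat="false" scheduled="2013-01-01 11:30:00">
			<video src="mp4:myStream_10-30.mp4" start="0" length="1800"/>
		</playlist>
	</body>
</smil>
```

As each new recording finishes, another playlist entry scheduled 30 minutes after the previous one is appended and the schedule reloaded.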

Regards,

Jason