Custom module to create single-frame snapshots of live and VOD streams

I’m curious whether there is public documentation of the structure of the keyframe data in an AMFPacket. If there were, I would be happy to take a stab at converting it to a bitmap, which would then make it relatively simple to drop to disk as a jpg or png without requiring an external process to fire off ffmpeg.

Also, it has been more than 6 months since your note about h.264 snapshots. Any change in that status? Again, if I can get the structure of the keyframe buffer documented somewhere, it can’t be too difficult to turn it into an uncompressed bitmap (I hope).

Could one create a subclass of the default MediaWriter class (or an implementation of IMediaWriter that delegates to the default) and execute the snapshot code directly in the writePackets() method? It would sure be nice to have snapshots waiting as soon as the stream completes writing. If so, would it be possible to get some example code? I can probably figure it out from the existing examples, but it is always nicer to have explicit instructions. I can inspect the jar files and see that I likely need to subclass:

com/wowza/wms/mediawriter/flv/MediaWriterFLV.class

com/wowza/wms/mediawriter/h264/MediaWriterH264.class

Perhaps it is possible to write multiple keyframes into a single file and then tell ffmpeg to extract every frame of the snapshot movie? That would be a convenient way to grab a snapshot every 30 seconds or so.
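If that approach pans out, a command along these lines (file names are placeholders) should dump every frame of such a snapshot movie as numbered JPEGs:

ffmpeg -i snapshots.flv -f image2 snapshot_%03d.jpg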

Also, I’m excited to hear that h.264 snapshots are now working. Time to start playing with switching over, I guess.

Hi Charlie,

I implemented the following class based on your sample code for creating snapshots (with necessary modifications to call this from another java class method):


import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;

import com.wowza.util.FLVUtils;
import com.wowza.wms.amf.AMFPacket;
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.IMediaStream;
import com.wowza.wms.stream.MediaStreamMap;

public class WMSSnapshotCreator extends ModuleBase {

	Object lock = new Object();

	public void createSnapshotLive(IApplicationInstance appInstance, String streamName) {

		getLogger().info("********** WMSSnapshotCreator::createSnapshotLive(): Stream Name: " + streamName + " **********");

		String fileName = "";

		MediaStreamMap streams = appInstance.getStreams();
		IMediaStream stream = streams.getStream(streamName);

		if (stream != null) {
			AMFPacket packet = stream.getLastKeyFrame();
			if (packet != null) {
				fileName = streamName + "_" + packet.getAbsTimecode() + ".flv";
				//getLogger().info("********** WMSSnapshotCreator::createSnapshotLive(): fileName: " + fileName + " **********");

				File newFile = stream.getStreamFileForWrite();
				String filePath = newFile.getPath().substring(0, newFile.getPath().length() - 4) + "_" + packet.getAbsTimecode() + ".flv";
				//getLogger().info("********** WMSSnapshotCreator::createSnapshotLive(): FLV Thumbnail File Path: " + filePath + " **********");

				// get the flv thumbnail file name alone from the file path
				// (index of the last occurrence of '\' in the file path)
				int fileNameIndex = filePath.lastIndexOf('\\');

				String flvFileName = "";
				if (fileNameIndex != -1) {
					flvFileName = filePath.substring(fileNameIndex + 1);
				} else {
					flvFileName = null;
				}

				getLogger().info("********** WMSSnapshotCreator::createSnapshotLive(): FLV Thumbnail File Name: " + flvFileName + " **********");

				try {
					synchronized (lock) {
						// write a single-frame FLV containing just the last keyframe
						BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(new File(filePath), false));
						FLVUtils.writeHeader(out, 0, null);
						FLVUtils.writeChunk(out, packet.getDataBuffer(), packet.getSize(), 0, (byte) packet.getType());
						out.close();
					}
					getLogger().info("********** WMSSnapshotCreator::createSnapshotLive(): Snapshot Created Successfully **********");
				} catch (Exception e) {
					getLogger().error("********** WMSSnapshotCreator::createSnapshotLive(): Error Creating Snapshot !!!" + e.toString() + " **********");
				}
			}
		} else {
			getLogger().error("********** WMSSnapshotCreator::createSnapshotLive(): Error Creating Snapshot - NULL Stream !!!" + " **********");
		}
	}

} // End of WMSSnapshotCreator


I am calling createSnapshotLive() using the following code:


WMSSnapshotCreator creator = new WMSSnapshotCreator() ;

creator.createSnapshotLive(appInstance, streamName) ;

with proper values for appInstance and streamName.

Behavior noticed:

An .flv file with a single frame gets created (with a size of about 3 KB) as expected.

NOTE:

My streamtype is ‘rtp-live-record’ and the codecs I am using are H.264 and AAC.

A bigger .flv file also gets created with the audio/video data as expected, and it can be played back in VLC.

Requirement and Issues:

I need to convert the .flv thumbnail to a .jpg thumbnail now. For this, I tried using the FFmpeg binaries as suggested in some Wowza posts. But when I do this with the single-frame .flv file, I get the following error in the command prompt:

Command:

ffmpeg -i mob1@nist.gov_mob2@nist.gov_3739073223_20090304_145240_1236158594066.flv -vframes 1 -an -f rawvideo -s 160x120 -y mob1@nist.gov_mob2@nist.gov_3739073223_20090304_145240_1236158594066.jpg

Logs:

FFmpeg version SVN-r15815, Copyright © 2000-2008 Fabrice Bellard, et al.
configuration: --enable-memalign-hack --enable-postproc --enable-swscale --enable-gpl --enable-libfaac --enable-libfaad --enable-libgsm --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libx264 --enable-libxvid --disable-ffserver --disable-vhook --enable-avisynth --enable-pthreads
libavutil 49.12. 0 / 49.12. 0
libavcodec 52. 3. 0 / 52. 3. 0
libavformat 52.23. 1 / 52.23. 1
libavdevice 52. 1. 0 / 52. 1. 0
libswscale 0. 6. 1 / 0. 6. 1
libpostproc 51. 2. 0 / 51. 2. 0
built on Nov 13 2008 10:28:29, gcc: 4.2.4 (TDM-1 for MinGW)
[h264 @ 0x28a7f0]no frame!
[flv @ 0x289660]Could not find codec parameters (Video: h264, yuv420p)
[flv @ 0x289660]Could not find codec parameters (Audio: 0x0000, 0 channels, s16)
mob1@nist.gov_mob2@nist.gov_3739073223_20090304_145240_1236158594066.flv: could not find codec parameters

Can you please let me know why I am getting such an error from FFmpeg? Am I doing anything wrong?

Regards,

Senthil
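As an aside, the "-f rawvideo" in that command looks inconsistent with a .jpg output, though the error above is reported while probing the input, so it is probably not the cause of the failure. For a single JPEG, a form like the following is more typical (the input file name is shortened here as a placeholder):

ffmpeg -i snapshot.flv -an -vframes 1 -s 160x120 -y snapshot.jpg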

Thanks Charlie.

Do you have any idea when a similar system will be available for H.264 video?

Regards,

Senthil


Also, please let me know if there is any workaround for achieving this now, such as extending the code in createSnapshotLive().
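One possible direction in the meantime: for H.264, a decoder cannot determine the codec parameters from the keyframe alone; it also needs the codec configuration packet (the SPS/PPS record). A rough, untested sketch of how the try block in createSnapshotLive() could be extended, assuming the Wowza build in use exposes IMediaStream.getVideoCodecConfigPacket(long):

	try {
		synchronized (lock) {
			// Sketch only: write the H.264 codec config packet (SPS/PPS) ahead of the
			// keyframe so that tools like FFmpeg can find the codec parameters.
			// getVideoCodecConfigPacket(long) is assumed to exist in this server version.
			AMFPacket configPacket = stream.getVideoCodecConfigPacket(packet.getAbsTimecode());

			BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(new File(filePath), false));
			FLVUtils.writeHeader(out, 0, null);
			if (configPacket != null)
				FLVUtils.writeChunk(out, configPacket.getDataBuffer(), configPacket.getSize(), 0, (byte) configPacket.getType());
			FLVUtils.writeChunk(out, packet.getDataBuffer(), packet.getSize(), 0, (byte) packet.getType());
			out.close();
		}
	} catch (Exception e) {
		getLogger().error("createSnapshotLive(): Error Creating Snapshot: " + e.toString());
	}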

Charlie,

The Java code modifications didn’t help… I am still getting the ‘could not find codec parameters’ error.


FFmpeg Command Executed:

**********************

ffmpeg -i mob1@nist.gov_mob2@nist.gov_1365117498_20090306_112948_1236319204062.flv -an -r 1 -y video%d.jpg

FFmpeg command line Output:

***********************

skipping flv packet: type 18, size 75, flags 0

skipping flv packet: type 9, size 41, flags 23

skipping flv packet: type 9, size 1597, flags 23

mob1@nist.gov_mob2@nist.gov_1365117498_20090306_112948_1236319204062.flv: could not find codec parameters

************************************************************

Regards,

Senthil

Hi Charlie,

Even with your FFmpeg command, it’s not working for me, so I suspect the version of FFmpeg I am using is the problem. Can you let me know which FFmpeg version you are using?

Regards,

Senthil


Please try with my .flv thumbnail file (mob1@nist.gov_mob2@nist.gov_923701976_20090306_110839_1236317978936.flv) created by Wowza using the WMSSnapshotCreator module. I am sending this as an email to you…

Thanks Charlie,

Yes - I too got that working with the version of FFmpeg you pointed out.

Regards,

Senthil

Hi Charlie,

I am using the livestreamrecord application to capture video from a webcam. The video and audio-only applications are working perfectly. However, when I try to use the “Snapshot” functionality, nothing is captured. If I understood right, the snapshot functionality generates an .flv file with the captured frames, but I am not seeing any .flv file for the snapshot. I have integrated livestreamrecord with JW Player. I am reproducing below the code being used:

The Flash-side ActionScript snippet is as follows:

function snapshot()
{
	var resultObj:Object = new Object();
	resultObj.onResult = function(fileName:String)
	{
		trace("result: "+fileName);
	}
	nc.call("createSnapshotLive", resultObj, "test");
	
}
doSnapshot.onPress = snapshot;

The application file contains the following:

<Root>
	<Application>
		<!-- Uncomment to set application level timeout values
		<ApplicationTimeout>60000</ApplicationTimeout>
		<PingTimeout>12000</PingTimeout>
		<ValidationFrequency>8000</ValidationFrequency>
		<MaximumPendingWriteBytes>0</MaximumPendingWriteBytes>
		<MaximumSetBufferTime>60000</MaximumSetBufferTime>
		<MaximumStorageDirDepth>25</MaximumStorageDirDepth>
		-->
		<Connections>
			<AutoAccept>true</AutoAccept>
			<AllowDomains></AllowDomains>
		</Connections>
		<!--
			StorageDir path variables
			
			${com.wowza.wms.AppHome} - Application home directory
			${com.wowza.wms.ConfigHome} - Configuration home directory
			${com.wowza.wms.context.VHost} - Virtual host name
			${com.wowza.wms.context.VHostConfigHome} - Virtual host config directory
			${com.wowza.wms.context.Application} - Application name
			${com.wowza.wms.context.ApplicationInstance} - Application instance name
			
		-->
		<Streams>
			<StreamType>live-record</StreamType>
			<StorageDir>D:/apache-tomcat-6.0.18/webapps/shrisemr/mm/content</StorageDir>
			<Properties>
				<!-- Properties defined here will override any properties defined in conf/Streams.xml for any streams types loaded by this application -->
				<!--
				<Property>
					<Name></Name>
					<Value></Value>
				</Property>
				-->
			</Properties>
		</Streams>
		<SharedObjects>
			<StorageDir></StorageDir>
		</SharedObjects>
		<Client>
			<IdleFrequency>-1</IdleFrequency>
			<Access>
				<StreamReadAccess>*</StreamReadAccess>
				<StreamWriteAccess>*</StreamWriteAccess>
				<StreamAudioSampleAccess></StreamAudioSampleAccess>
				<StreamVideoSampleAccess></StreamVideoSampleAccess>
				<SharedObjectReadAccess>*</SharedObjectReadAccess>
				<SharedObjectWriteAccess>*</SharedObjectWriteAccess>
			</Access>
		</Client>
		<RTP>
			<!-- RTP/Authentication/Methods defined in Authentication.xml. Default setup includes; none, basic, digest -->
			<Authentication>
				<Method>digest</Method>
			</Authentication>
			<!-- RTP/AVSyncMethod. Valid values are: senderreport, systemclock, rtptimecode -->
			<AVSyncMethod>senderreport</AVSyncMethod>
			<MaxRTCPWaitTime>12000</MaxRTCPWaitTime>
			<Properties>
				<!-- Properties defined here will override any properties defined in conf/RTP.xml for any depacketizers loaded by this application -->
				<!--
				<Property>
					<Name></Name>
					<Value></Value>
				</Property>
				-->
			</Properties>
		</RTP>
		<MediaCaster>
			<Properties>
				<!-- Properties defined here will override any properties defined in conf/MediaCasters.xml for any MediaCasters loaded by this applications -->
				<!--
				<Property>
					<Name></Name>
					<Value></Value>
				</Property>
				-->
			</Properties>
		</MediaCaster>
		<MediaReader>
			<Properties>
				<!-- Properties defined here will override any properties defined in conf/MediaReaders.xml for any MediaReaders loaded by this applications -->
				<!--
				<Property>
					<Name></Name>
					<Value></Value>
				</Property>
				-->
			</Properties>
		</MediaReader>
		<!-- 
		<Repeater>
			<OriginURL></OriginURL>
		</Repeater> 
		-->
		<Modules>
			<Module>
				<Name>snapshot</Name>
				<Description>Create Snapshot</Description>
				<Class>com.wowza.wms.plugin.test.module.CreateSnapshot</Class>
			</Module>
			<Module>
				<Name>base</Name>
				<Description>Base</Description>
				<Class>com.wowza.wms.module.ModuleCore</Class>
			</Module>
			<Module>
				<Name>properties</Name>
				<Description>Properties</Description>
				<Class>com.wowza.wms.module.ModuleProperties</Class>
			</Module>
			<Module>
				<Name>logging</Name>
				<Description>Client Logging</Description>
				<Class>com.wowza.wms.module.ModuleClientLogging</Class>
			</Module>
			<Module>
				<Name>livestreamrecord</Name>
				<Description>Live Stream Record</Description>
				<Class>com.wowza.wms.plugin.livestreamrecord.ModuleLiveStreamRecord</Class>
			</Module>
			<Module>
				<Name>flvplayback</Name>
				<Description>FLVPlayback</Description>
				<Class>com.wowza.wms.module.ModuleFLVPlayback</Class>
			</Module> 
		</Modules>
		<Properties>
			<!-- Properties defined here will be added to the IApplication.getProperties() and IApplicationInstance.getProperties() collections -->
			<!--
			<Property>
				<Name></Name>
				<Value></Value>
			</Property>
			-->
		</Properties>
	</Application>
</Root>

Thanks in advance for your help,

Regards,

Kishore

I tried to build this custom snapshot module, but I get the following errors…

• The method getStreamFileForWrite(String, null, null) is undefined for the type IMediaStream

• The method getVideoCodecConfigPacket() in the type IMediaStream is not applicable for the arguments (long)

• The method isVideoKeyFrame(AMFPacket) is undefined for the type FLVUtils

Am I missing something or what?

What version of Wowza Media Server are you building the code against? It needs to be the latest 1.7.2.

Charlie

I updated 1.6.0 → 1.7.2 and now it’s working fine. Thanks!

We have installed this custom module, and it works perfectly for creating snapshots of streams that are being broadcast through Flash! However, we’re stuck on how to create snapshots for broadcasts coming from, for example, Adobe Flash Media Encoder. Would there be a way to trigger the createSnapshot event on the server side for an active broadcast, by any chance? Any input is highly appreciated.

Without some input from a user, it would have to be done on a timer.

But have you tried it with FMLE? If you know the name of the stream and you are connected to the same application, calling this function from a Flash app while FMLE is publishing should work.

Richard

We are currently already doing it with a timer within the broadcasting application in Flash. But to extract thumbnails for streams that are being broadcast with an application such as FMLE, we would like to do it entirely on the server side. Would there be a way to trigger the createSnapshot command from the server side for a certain stream?
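For reference, a minimal sketch of the server-side timer approach Richard describes might look something like this. It is untested, reuses the WMSSnapshotCreator class from earlier in the thread, uses a placeholder stream name, and assumes the standard onAppStart/onAppStop module lifecycle callbacks:

import java.util.Timer;
import java.util.TimerTask;

import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.module.ModuleBase;

public class ModuleSnapshotTimer extends ModuleBase {

	private Timer timer;

	public void onAppStart(final IApplicationInstance appInstance) {
		timer = new Timer();
		// Take a snapshot of the named stream every 30 seconds
		timer.scheduleAtFixedRate(new TimerTask() {
			public void run() {
				WMSSnapshotCreator creator = new WMSSnapshotCreator();
				creator.createSnapshotLive(appInstance, "myStream"); // placeholder stream name
			}
		}, 30000, 30000);
	}

	public void onAppStop(IApplicationInstance appInstance) {
		if (timer != null)
			timer.cancel();
	}
}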

We’ll look into this, thanks a lot Richard!

Hi,

I am using ffmpeg, and I am able to create thumbnails for regular mp4 movie files that I download from the internet. But when I try the same with the Extremists sample that I got from the Media Server downloads, it doesn’t work.

I noticed that the Extremists file has the extension .m4v (mp4 video). How is this different, and how can I use ffmpeg to create a thumbnail for it?

Moreover, when live streams are recorded on the Media Server, what extension do they normally have (in the case of an H.264 video stream)?

Thanks,

Nish.

Richard and Charlie, thank you for the quick replies.

I am using the JW Flash Player at the user end for my application, and the flashvars I am using look like this:

streamer=rtmp://[wowza ip address]:1935/rtplive&file=udp://[stream-ip]:10000

I want to write a VOD interface which accesses the recorded files in the content folder on the Wowza server. To build a gallery on that page, I want to make sure I am writing suitable code for the right file format. If it is an .flv then I wouldn’t have a problem. I am using a HaiVision Makito encoder and H.264 video, so would they still be stored in .flv format?

Thanks!!

If I do use mp4: as a prefix, can you tell me the extension that the stored file would have?

I want to make sure that it is playable in the JW Flash Player, which cannot play files with an .mp4 extension but can play .m4v files.

I am trying to populate a gallery with VOD files on the Wowza Server. My client application uses ffmpeg to create thumbnails from the VOD videos (on the Wowza server) and uses them in the gallery. Thumbnail generation is done at runtime in the client application.

How can I access the VOD files on the Wowza Server from the client application in order to generate their thumbnails?

I am using this for thumbnail generation:

ffmpeg -i inputfile -vframes 1 -ss 00:00:07 -s 150x150 thumbname.jpg

Here, in place of inputfile, I am trying to give the path to the video file.

Can you help me out with this? It would also help me greatly if you could tell me how to iterate through all the VOD files on Wowza to get their filenames and access them for thumbnail generation.
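If the machine generating the thumbnails can see Wowza’s content directory (locally or via a network share), a plain directory scan is enough to iterate the VOD files; here is a rough Java sketch with placeholder paths, running the same ffmpeg command once per file:

import java.io.File;

public class ThumbnailBatch {

	public static void main(String[] args) throws Exception {
		// Placeholder path: Wowza's content folder (or the StorageDir of the application)
		File contentDir = new File("/usr/local/WowzaMediaServer/content");
		File[] files = contentDir.listFiles();
		if (files == null)
			return;

		for (File f : files) {
			String name = f.getName().toLowerCase();
			if (!(name.endsWith(".flv") || name.endsWith(".mp4") || name.endsWith(".m4v")))
				continue;

			String thumb = f.getAbsolutePath() + ".jpg";
			// One thumbnail per file: single frame, 7 seconds in, 150x150, overwrite if present
			new ProcessBuilder("ffmpeg", "-i", f.getAbsolutePath(),
					"-vframes", "1", "-ss", "00:00:07", "-s", "150x150", "-y", thumb)
					.inheritIO().start().waitFor();
		}
	}
}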

Do you have an example of how to call the server-side function from the client?

My client application is in ASP.NET and C#. Do I have to call the function using ActionScript, or can I call it from the C# code? I had a tough time trying to use NetConnection from C#; I couldn’t find a way of doing it.

I have followed the readme, installed the module in the application, and changed the Application.xml file accordingly. But I do not understand how to proceed from there. Any help in this regard would be appreciated.