You can create an HTTP provider version of this application module:
https://www.wowza.com/docs/http-providers
You need the IDE:
http://wowza.com/ide.html
It takes some understanding of Wowza application modules and HTTP providers. There is a pretty good example of an HTTP provider version of an application module in the LiveStreamRecord addon.
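The basic shape of an HTTP provider is roughly this (an untested sketch; the snapshot call itself is just a placeholder for the logic that lives in the application module):

package com.mycompany.wms.http;   // example package name

import java.io.OutputStream;

import com.wowza.wms.http.HTTProvider2Base;
import com.wowza.wms.http.IHTTPRequest;
import com.wowza.wms.http.IHTTPResponse;
import com.wowza.wms.vhost.IVHost;

public class HTTPSnapshotProvider extends HTTProvider2Base
{
    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp)
    {
        if (!doHTTPAuthentication(vhost, req, resp))
            return;

        // Placeholder: call the same snapshot code the application module uses.
        String retStr = "<html><body>snapshot requested</body></html>";

        try
        {
            resp.setHeader("Content-Type", "text/html");
            OutputStream out = resp.getOutputStream();
            out.write(retStr.getBytes());
        }
        catch (Exception e)
        {
        }
    }
}

The provider then has to be registered in VHost.xml; the first link above covers that part.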
Richard
Looking at the code, I think it will, but the output will be a one-frame FLV.
Richard
No, I am pretty sure that is not possible.
Richard
However, on the camera side, try setting the key frame interval to 60 and the FPS to 30. Then you will have a key frame every 2 seconds (60 frames / 30 fps = 2 seconds).
Richard
You could lower the bitrate or use a higher compression setting (if it is an Axis cam) so you can set the FPS higher. I don’t know of another solution otherwise.
Richard
It opens in FFprobe and should be suitable for making a JPG from.
Richard
hi,
this should work for you
ffmpeg -i yourvideo.flv -s 160x120 -ss 00:00:02 -f mjpeg -t 0.001 -y yourthumbnail.jpg
hope this helps
koz
In fact this will work better, giving you better compression on the JPEG:
ffmpeg -i yourvideo.flv -vcodec mjpeg -vframes 1 -an -f rawvideo -s 160x120 -ss 00:00:02 -y yourthumbnail.jpg
This takes an image from two seconds into the video and creates a 160x120 JPEG.
hope this helps,
koz
thx, I’m really stupid. I don’t see it
A new problem; it’s not related to the Wowza server but to conversion with ffmpeg. I usually use:
ffmpeg -i file.flv -vcodec png -vframes 1 -ss 0:0:5.000 -an -f rawvideo -y snapshot.png
something which usually works fine. With the latest version of Wowza there is now support for H.264, so I tried it and it is really useful; the streaming quality is much better. But when I want to generate snapshots from those videos, the command fails:
FFmpeg version SVN-r12665, Copyright (c) 2000-2008 Fabrice Bellard, et al.
configuration: --enable-gpl --enable-postproc --enable-swscale --enable-avfilter-lavf --enable-pthreads --enable-liba52 --enable-avisynth --enable-libfaac --enable-libfaad --enable-libgsm --enable-memalign-hack --enable-libmp3lame --enable-libnut --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --cpu=i686 --extra-ldflags=-static
libavutil version: 49.6.0
libavcodec version: 51.54.0
libavformat version: 52.13.0
libavdevice version: 52.0.0
built on Apr 2 2008 22:35:11, gcc: 4.2.3
[flv @ 00AA6C70]Unsupported video codec (7)
[flv @ 00AA6C70]Unsupported video codec (7)
... (the same line is repeated many more times)
Seems stream 0 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 25.00 (25/1)
Input #0, flv, from 'file.flv':
Duration: 00:00:50.3, start: 0.000000, bitrate: 80 kb/s
Stream #0.0: Video: 0x0007, 25.00 tb(r)
Stream #0.1: Audio: mp3, 22050 Hz, stereo, 80 kb/s
picture size invalid (0x0)
Cannot allocate temp picture, check pix fmt
It seems to me that ffmpeg does not support H.264 here; that is the only thing that has changed since before. (The "Unsupported video codec (7)" message comes from the FLV demuxer; codec id 7 in the FLV container is AVC/H.264.) But surprisingly, the documentation and official ffmpeg website clearly say that FLV with H.264 is supported. Has anybody got an idea, or has anybody already run into this error?
Is it possible to show the captured frame in the Flash client?
I want to save snapshots of live streams to the database so that I can later view them from the admin panel, which is a Flex client.
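What I have in mind for the storing part is roughly this (only a sketch; the JDBC URL, table name and column names are made up for illustration):

import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SnapshotStore
{
    // Store the snapshot image as a BLOB so the Flex admin panel can fetch it later.
    public void saveSnapshot(String streamName, File jpegFile) throws Exception
    {
        // Example connection - replace with your own database settings.
        Connection conn = DriverManager.getConnection("jdbc:mysql://localhost/wowza", "user", "password");
        try
        {
            PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO snapshots (stream_name, image) VALUES (?, ?)");
            stmt.setString(1, streamName);
            FileInputStream imageIn = new FileInputStream(jpegFile);
            stmt.setBinaryStream(2, imageIn, (int)jpegFile.length());
            stmt.executeUpdate();
            imageIn.close();
            stmt.close();
        }
        finally
        {
            conn.close();
        }
    }
}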
Thanks
I modified the Java code in the first post. See if that does the trick. I did not have time to test it.
Charlie
Hi Charlie,
where do I find the code snippet mentioned above?
Regards,
Markus
Hi Richard,
I use the code you posted on the first page for creating snapshots.
The problem is that this code does not seem to work for recorded H.264 streams.
At the top of page 3 Charlie mentioned that he had modified the code on page 1 so that it also works for H.264.
I am especially interested in these modifications.
For the sake of completeness, here is the code I currently use for creating snapshots (it is a slightly modified version of the createSnapshotVOD(…) code on page 1):
public void createFlvSnapshot(Snapshot snapshot)
{
    // Timecode (in milliseconds) of the frame I want to grab.
    long timecodeInMillis = Integer.valueOf(snapshot.getStreamPosInSec()) * 1000;

    if (!snapshot.getSnapshotDirectory().isDirectory())
    {
        snapshot.getSnapshotDirectory().mkdirs();
    }

    if (snapshot.getFlvFile().exists())
    {
        log.info("create flv snapshot from: " + snapshot.getFlvFile().getAbsolutePath() + " dst: " + snapshot.getFlvSnapshotFile().getAbsolutePath());

        // Scan the FLV for the last video key frame before the requested timecode.
        AMFPacket lastVideoKeyFrame = null;
        try
        {
            BufferedInputStream is = new BufferedInputStream(new FileInputStream(snapshot.getFlvFile()));
            FLVUtils.readHeader(is);
            AMFPacket amfPacket;
            while ((amfPacket = FLVUtils.readChunk(is)) != null)
            {
                if (lastVideoKeyFrame != null && amfPacket.getTimecode() > timecodeInMillis)
                    break;
                if (amfPacket.getType() != IVHost.CONTENTTYPE_VIDEO)
                    continue;
                if (FLVUtils.getFrameType(amfPacket.getFirstByte()) == FLVUtils.FLV_KFRAME)
                    lastVideoKeyFrame = amfPacket;
            }
            is.close();
        }
        catch (Exception e)
        {
            log.error("Error: createFlvSnapshot: reading flv: " + e.toString());
        }

        // Write the key frame out as a one-frame "mini" FLV.
        if (lastVideoKeyFrame != null)
        {
            try
            {
                BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(snapshot.getFlvSnapshotFile(), false));
                FLVUtils.writeHeader(out, 0, null);
                FLVUtils.writeChunk(out, lastVideoKeyFrame.getDataBuffer(), lastVideoKeyFrame.getSize(), 0, (byte)lastVideoKeyFrame.getType());
                out.close();
            }
            catch (Exception e)
            {
                log.error("Error: createFlvSnapshot: writing flv: " + e.toString());
            }
        }
    }
}
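To turn that mini-FLV into an actual image I then call ffmpeg externally, roughly like this (the ffmpeg path is specific to my setup; the options follow the commands posted earlier in this thread):

// Convert the one-frame mini-FLV into a JPEG by running ffmpeg externally.
// Needs java.io.BufferedReader and java.io.InputStreamReader in the imports.
private void convertSnapshotToJpeg(File miniFlv, File jpegFile) throws Exception
{
    ProcessBuilder pb = new ProcessBuilder(
        "/usr/bin/ffmpeg",                  // path to ffmpeg on my machine
        "-i", miniFlv.getAbsolutePath(),
        "-vcodec", "mjpeg",
        "-vframes", "1",
        "-an",
        "-y", jpegFile.getAbsolutePath());
    pb.redirectErrorStream(true);           // merge stderr into stdout so ffmpeg cannot block on a full pipe
    Process proc = pb.start();

    // Drain ffmpeg's output, then wait for it to finish.
    BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
    while (reader.readLine() != null)
        ;
    int exitCode = proc.waitFor();
    reader.close();
    if (exitCode != 0)
        log.error("convertSnapshotToJpeg: ffmpeg exited with code " + exitCode);
}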
Hi Charlie,
AMFPacket codecConfig = stream.getVideoCodecConfigPacket(packet.getAbsTimecode());
How do I extract such a codec config packet from a recorded H.264 file?
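What I am thinking of is to scan the recorded file for the AVC sequence header myself, roughly like this (untested; the codec id 7 and the "second data byte is 0" check come from the FLV spec, not from a Wowza helper I know of):

// Untested sketch: look for the AVC sequence header (the "codec config") in a recorded FLV.
// In an FLV video tag the low nibble of the first data byte is the codec id (7 = AVC/H.264)
// and, for AVC, the second data byte is the AVCPacketType (0 = sequence header).
AMFPacket codecConfig = null;
BufferedInputStream cis = new BufferedInputStream(new FileInputStream(snapshot.getFlvFile()));
FLVUtils.readHeader(cis);
AMFPacket p;
while ((p = FLVUtils.readChunk(cis)) != null)
{
    if (p.getType() != IVHost.CONTENTTYPE_VIDEO)
        continue;
    byte[] data = p.getDataBuffer();
    if (data == null || data.length < 2)
        continue;
    if ((data[0] & 0x0f) == 7 && data[1] == 0)
    {
        codecConfig = p;   // normally the very first video tag in the file
        break;
    }
}
cis.close();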
Regards
Markus
Hi Charlie,
I want to take a snapshot from an FLV file which was recorded some time ago.
Therefore I don’t have an IMediaStream object at the time I want to create the snapshot.
My first step is to “load” the file via:
BufferedInputStream is = new BufferedInputStream(new FileInputStream(snapshot.getFlvFile()));
Then I search for the AMFPacket at the given timecode:
FLVUtils.readHeader(is);
AMFPacket amfPacket;
while ((amfPacket = FLVUtils.readChunk(is)) != null)
{
    if (lastVideoKeyFrame != null && amfPacket.getTimecode() > timecodeInMillis)
        break;
    if (amfPacket.getType() != IVHost.CONTENTTYPE_VIDEO)
        continue;
    if (FLVUtils.getFrameType(amfPacket.getFirstByte()) == FLVUtils.FLV_KFRAME)
        lastVideoKeyFrame = amfPacket;
}
is.close();
In the last step I write this packet (lastVideoKeyFrame) to a new FLV file.
The code for this step is:
BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(snapshot.getFlvSnapshotFile(), false));
FLVUtils.writeHeader(out, 0, null);
//FLVUtils.writeChunk(out, lastVideoKeyFrame.getDataBuffer(), lastVideoKeyFrame.getSize(), 0, (byte)lastVideoKeyFrame.getType());

// "stream" (an IMediaStream) and "packet" do not exist in this context - that is exactly my problem.
AMFPacket codecConfig = stream.getVideoCodecConfigPacket(packet.getAbsTimecode());
if (codecConfig != null)
{
    FLVUtils.writeChunk(out, codecConfig.getDataBuffer(), codecConfig.getSize(), 0, (byte)codecConfig.getType());
}
FLVUtils.writeChunk(out, lastVideoKeyFrame.getDataBuffer(), lastVideoKeyFrame.getSize(), 0, (byte)lastVideoKeyFrame.getType());
out.close();
Regards,
Markus
Hi Charlie,
grabbing an image directly from a saved FLV file with ffmpeg
is much slower than creating a mini-FLV and converting this mini-FLV to an image afterwards.
Regards,
Markus
Yesterday I tried to grab 3 thumbnails (beginning, middle, end) from a recorded FLV file (H.264), using ffmpeg from an external command line. Everything works fine.
Now I’m wondering about ffmpeg’s behaviour when grabbing frames. A few years ago, when I tried this, I noticed that ffmpeg reads through the whole file in order to seek. When the FLV file is 2 hours long, for instance, that can be really slow!
Let’s say FLVUtils offered a native PNG/JPG export function for a given frame; what would FLVUtils’ behaviour be in this case?
In any case, your module works great and the feature is really impressive (using ffmpeg to convert to PNG, then deleting the FLV frame)…
Thank you!
As far as I know, the NetConnection object is ActionScript.
You can’t call server-side functions from C#; you need a piece of Flash or Flex code…
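For reference, on the server side such a call just hits a public method of the Wowza module that takes the standard parameter list; a rough sketch (the module name, method name and parameter are only examples):

package com.mycompany.wms.module;   // example package

import com.wowza.wms.amf.AMFDataList;
import com.wowza.wms.client.IClient;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.request.RequestFunction;

public class ModuleSnapshot extends ModuleBase
{
    // Called from the Flash/Flex client with something like:
    //   nc.call("takeSnapshot", new Responder(onResult, onStatus), "myStream");
    public void takeSnapshot(IClient client, RequestFunction function, AMFDataList params)
    {
        String streamName = getParamString(params, PARAM1);
        // ... create the snapshot here (see the code earlier in this thread) ...
        sendResult(client, params, "snapshot requested for " + streamName);
    }
}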
How do you use it in AS3? nc.call()? Any example, please?