Transcoding Solution For Legacy MJPEG Cameras

See the latest post for the final solution. The solution in this post has some issues, but it can still be used after hacking ffmpeg.c.

I thought I would share my solution for streaming legacy MJPEG network cameras through Wowza. I used ffmpeg and ffserver to transcode an MJPEG stream from a Canon VC-C50i to H.264. I did this on OS X, but it should work on any *nix machine. This solution was adapted from here.

This is the proof of concept version of a production solution.

Prerequisites

ffmpeg source

libx264

Wowza Media Server 2

If you want to test the resulting H.264 stream with VLC, then you will need to modify ffserver as outlined here.

I used ffmpeg r25013. Later releases give me problems that I have not worked out yet.

ffserver.conf

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0
RTSPBindAddress 0.0.0.0
RTSPPort 554

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 20000

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 20000

# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 100000 

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

<Feed feed1.ffm>
	File /tmp/feed1.ffm
	# When the File line is commented out, no file is created and the stream keeps working!
	FileMaxSize 200K
	# Only allow connections from localhost to the feed.
	ACL allow 127.0.0.1
</Feed>

<Stream live.h264>
Format rtp
Feed feed1.ffm
VideoCodec libx264
VideoFrameRate 5
VideoBitRate 100
VideoSize 320x240
AVPresetVideo default
AVPresetVideo baseline
AVOptionVideo flags +global_header
NoAudio
</Stream>

<Stream stat.html>
	Format status
</Stream>

<Redirect index.html>
	# credits!
	URL http://ffmpeg.sourceforge.net/
</Redirect>

I use a PHP script from the command line to stream from the Canon cams and pipe into ffmpeg. These scripts were adapted from a blog post I cannot find.

HTTP PHP streamer (pull_stream_http.php). I have found that for the Canon cams it is more reliable to use the TCP streaming port; I will post that PHP script if someone wants it.

<?php
// Simple log
$fp = fopen("log.txt", "w");

// Run forever
while (true) {
    $handle = fopen("http://CAMERA_IP:PORT/-wvhttp-01-/getoneshot?frame_count=0", "rb");

    if ($handle === FALSE) {
        fwrite($fp, "Connect failed - Retrying\r\n");
        sleep(100);
        continue;
    }

    fwrite($fp, "Connected to camera\r\n");

    // Copy frames to stdout until the camera closes the connection
    while (!feof($handle)) {
        print fread($handle, 8192);
    }
    fclose($handle);

    fwrite($fp, "Connection failed - Sleeping\r\n");
    sleep(100);
}
?>
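Before wiring this pipe into ffmpeg, it is worth confirming that it actually carries JPEG data: every frame the camera emits should begin with the JPEG start-of-image marker, bytes ff d8. A minimal sketch of that check, using a synthetic byte sequence in place of a real capture (with the camera attached you would instead capture via `php -f pull_stream_http.php | head -c 1000000 > /tmp/capture.mjpg`):

```shell
# Fake a captured frame: JPEG SOI marker, some payload, then the EOI marker.
printf '\xff\xd8\xff\xe0payload\xff\xd9' > /tmp/capture.mjpg

# Dump the first two bytes; a valid MJPEG capture starts with the SOI marker.
head -c 2 /tmp/capture.mjpg | od -An -tx1
# prints: ff d8
```

If the dump does not start with ff d8, ffmpeg's mjpeg demuxer will have nothing to lock onto and the feed will stay silent.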

The two shell scripts

server.sh

#!/bin/bash
# Restart ffserver automatically if it exits
while true
do
  ffserver -loglevel verbose
done

stream.sh

#!/bin/bash
# Restart the camera-to-ffserver pipe automatically if it exits
while true
do
  php -f pull_stream_http.php | ffmpeg -an -f mjpeg -maxrate 500 -r 6 -i - http://localhost:8090/feed1.ffm
done

Now just start the server and then the stream. You should be able to connect to rtsp://localhost:554/live.h264 with VLC (assuming you modified ffserver). If you do not see a stream, check the ffserver console log. If you do not see a request there, then ffserver has not opened port 554; make sure the user you are running as is allowed to bind to this port.
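If you want to confirm the port is actually open without reaching for netstat or lsof, bash can probe a TCP port on its own via the /dev/tcp pseudo-device (a bash feature, not a real file; this is my own quick check, not part of the original setup):

```shell
# Probe a TCP port; bash's /dev/tcp pseudo-device opens a connection on success.
check_port () {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo "port $2 open on $1"
  else
    echo "port $2 closed on $1"
  fi
}

check_port localhost 554
```

If this reports closed while ffserver is running, the bind to 554 failed (usually a privilege problem, since ports below 1024 need root on most systems).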

After you confirm it is streaming, all you need to do is set up an application in Wowza. Set the content stream to rtsp://localhost:554/live.h264

You could simply use ffserver to do all the streaming, but I have found Wowza to be much more reliable. This also allows you to move the transcoding to another machine.
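For reference, the Wowza side needs nothing exotic. A sketch of the relevant part of the application's Application.xml, assuming the stock live-RTP stream type that ships with Wowza Media Server 2 (check the Wowza documentation for your exact version; the element names here are from the bundled rtplive example, not from this post):

```xml
<Streams>
    <!-- "rtp-live" pulls an RTSP/RTP source, which is what our ffserver exposes -->
    <StreamType>rtp-live</StreamType>
</Streams>
```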

I hope this helps.

A single-command way to do it, from commenter Jose Antonio Caso Jacobs at this blog:

ffmpeg -er 4 -y -r 5 -f mjpeg -an -i http://CAMERA_IP/-wvhttp-01-/getoneshot?frame_count=0 -vcodec libx264 -vpre fastfirstpass -vpre baseline -b 24k -bt 32k -threads 0 -f rtsp rtsp://127.0.0.1:1935/rtplive/myStream.sdp

Or

php -f pull_stream_http.php | ffmpeg -er 4 -y -r 5 -f mjpeg -an -i - -vcodec libx264 -vpre fastfirstpass -vpre baseline -b 24k -bt 32k -threads 0 -f rtsp rtsp://127.0.0.1:1935/rtplive/myStream.sdp

This works really well if you are going to stream the camera at full rate. I have not found a good way to limit the stream to a single frame per second while keeping the iPhone stream from timing out (Flash also takes forever to buffer). Using the method in the first post, you can 'fake' a faster fps in the resulting stream by setting VideoFrameRate to a higher number than the actual input stream.
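Concretely, with a camera that really delivers about 1 fps, the Stream section in ffserver.conf above can simply advertise a higher output rate and let the encoder pad with duplicate frames. A sketch (VideoFrameRate 10 here is an illustrative value, not something from the original config):

```
<Stream live.h264>
Format rtp
Feed feed1.ffm
VideoCodec libx264
# Advertised rate, higher than the real ~1 fps input
VideoFrameRate 10
VideoBitRate 100
VideoSize 320x240
NoAudio
</Stream>
```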

Cool! Happy to hear you have it sorted out.

Charlie

I had some serious issues with the above solution. ffmpeg was not sending out key frames as I had originally thought when using the -g option. This led to Wowza client RTSP streams timing out, and long load times for RTMP. The latest ffmpeg release has the -force_key_frames option; modifying this option in ffmpeg.c to insert key frames every Nth frame fixed the issue.

The hack of ffmpeg.c

In do_video_out(…) comment out

if (ost->forced_kf_index < ost->forced_kf_count &&
    big_picture.pts >= ost->forced_kf_pts[ost->forced_kf_index]) {
    big_picture.pict_type = FF_I_TYPE;
    ost->forced_kf_index++;
}

and replace with

/* In this hack, forced_kf_pts[0] holds the single -force_key_frames
   argument: the number of frames between key frames. */
if (ost->forced_kf_index > ost->forced_kf_pts[0]) {
    big_picture.pict_type = FF_I_TYPE;
    ost->forced_kf_index = 0;
} else {
    ost->forced_kf_index++;
}

Keep in mind that this hack means you have to specify -force_key_frames, and that -force_key_frames now takes only one parameter: the number of frames between key frames. A proper command line option should be added to ffmpeg; this is just a test hack.

My command line argument is now

php -f ../pull_stream_http.php IP_ADDRESS 2000 | ffmpeg \
-er 4 -y -r 2 -f mjpeg -an  -force_key_frames 2 -i - -vcodec libx264 \
-vpre fastfirstpass -vpre baseline -b 30k -bt 32k -threads 0 -f \
rtsp rtsp://user:pass@WOWZA_IP:1935/rtplive/test.stream

This works really well at low bitrates and frame rates.

I modified ffmpeg.c to add the option -force_nth_key_frames. Send me a PM if you would like it. This option overrides the -force_key_frames option in my implementation. This ffmpeg.c is from SVN-r25862.

Has anyone tried this successfully with the latest ffmpeg code from the git server?

I am now having an issue when running stream.sh, with the error message: "Invalid input file index: 0". I also tried modifying ffmpeg.c, but with no luck.

Here’s my stream.sh:

#!/bin/bash

while [ 1 ]

do

#php -f pull_stream_http.php | ffmpeg -an -f mjpeg -maxrate 500 -b24k -r 6 -i - 6 http://localhost:8090/feed1.ffm

php -f pull_stream_http.php | ffmpeg -an -f mjpeg -r -i -vcodec libx264 -map 0 -vpre fastfirstpass -vpre baseline -b 24k -bt 32k -threads 0 http://localhost:8090/feed1.ffm

done