How to improve webcam live video streaming quality

Hello,

I’m implementing a (Flash client with webcam --> Wowza server --> Flash client) solution in order to replace a live application that uses an Axis camera.

It appears that the quality of the live stream viewed on the client side is much poorer than with the Axis cam.

In the Flash client publishing the stream, the camera.width and camera.height properties appear to be only 160 x 120, compared to 640 x 480 with the Axis camera! Oops…

So the final result is badly pixelated.

Does anyone have an idea about how to improve the final quality of the live stream?

Thanks a lot.

Philippe.

You can use Camera.setMode and Camera.setQuality:

camera.setMode(320, 240, 20, false); // set size here
camera.setQuality(0, 90); // this favors quality over bandwidth

This is as exact as you can get using the Flash plugin as the encoder. You should be able to get better quality from the same camera with a dedicated encoder.

Note: don’t set the 2nd arg of setQuality (the quality setting) above 90. It is known to be a problem.

Richard

It is hard to know. I have found through experimentation that using the quality setting to control quality rather than bandwidth produces better-looking video from the encoder. I am just not sure why, although it does make sense.
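
For example (just a sketch to illustrate the two modes; the numbers are placeholders, not recommendations):

camera.setQuality(16384, 0); // bandwidth-driven: cap the feed at about 16 KB/sec and let Flash vary picture quality to fit
camera.setQuality(0, 90); // quality-driven: hold picture quality at 90 and let the feed use as much bandwidth as it needs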

Charlie

I think you are confusing the recorded file with the stream name used for live streaming. For live streaming there is no extension. The stream name is fixed. So there is no need to change the stream name used when playing a live stream when you switch between H.264 and VP6. Use the same stream name in JW Player that is working for VP6. It will also work for H.264.

Charlie

These are Flash ActionScript commands. In the VideoRecording client and clientAS2 examples you can find them in the ActionScript pane of the FLA file. In the clientFlex version they are in the RecordingController.as file.

Richard

I thought you were referring to the camera.setQuality and camera.setMode lines. Those are ActionScript commands, not relevant to FMLE.

Richard

No, these settings are relevant to publishing with a Flash application, using NetStream.publish with the Camera object.
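
Roughly like this (a minimal sketch; the server URL and stream name are placeholders):

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://[wowza-address]:1935/[appName]");

function onStatus(event:NetStatusEvent):void
{
	if (event.info.code == "NetConnection.Connect.Success")
	{
		// grab the default camera and apply the size/quality settings discussed above
		var camera:Camera = Camera.getCamera();
		camera.setMode(320, 240, 20, false);
		camera.setQuality(0, 90);

		// attach the camera to a stream and publish it live to Wowza
		var ns:NetStream = new NetStream(nc);
		ns.attachCamera(camera);
		ns.publish("myStream", "live");
	}
}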

Richard

This is probably nothing new: first, use H.264.

Then, if internet delivery and player technologies are not factors, make the bitrate, fps, and encoding parameters very high.

If internet delivery and a variety of player technologies are factors, then you have to compromise and play around with all those settings until you find something that, first of all, doesn’t buffer, because buffering is worse than low resolution.

Richard

(I have noticed that when I set the bitrate higher than 500, I get a lot of phone calls about buffering?)

That’s the trade-off between resolution and user experience, as I was saying. Not buffering is more important than higher resolution.

And the advantage of H.264 is that you can get the same quality with lower bitrates. Make some recordings of equal time length, and compare size on disk and kbps.

size in kilobytes * 8 / duration in seconds = kilobits per second (kbps)
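
For example, a recording that is 6,000 kilobytes on disk and runs 120 seconds works out to 6000 * 8 / 120 = 400 kbps (made-up numbers, just to show the arithmetic).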

Richard

That message is about the extension of the file name in the “Save to File” box. You can uncheck that. That is saving a file to your computer; it is not what is being streamed.

The name of the file you are streaming is above that, in a box labeled “Stream:” (below the FMS URL and Backup URL).

It is okay to not use any extension. Just enter “myStream”, for example. Then play that with H.264 settings. Test in the SimpleVideoStreaming example player:

Server: rtmp://[wowza-address]:1935/[appName]

Stream: myStream

Richard

Whatever you put in the “Stream:” box in FMLE is exactly what you should reference in the JW Player “file” flashvar. Exactly. If you want to put an extension, you can, but it’s really not necessary. You should also add the flashvar “provider=rtmp”.

If you have this:

&stream=rtmp://[wowza-address]:1935/livevideo&provider=rtmp&file=BOD

Then the Stream: in FMLE should be “BOD”

Richard

For reference, take a look at the VideoRecording example that ships with Wowza, in the videorecording.fla file, at the startCamera function:

function startCamera()
{	
	// get the default Flash camera and microphone
	camera = Camera.getCamera();
	microphone = Microphone.getMicrophone();
	// here are all the quality and performance settings
	camera.setMode(160, 120, 30, false);
	camera.setQuality(0, 88);
	camera.setKeyFrameInterval(30);
	microphone.rate = 11;
	
	nameStr.text = movieName;
	AppendCheckbox.selected = false;
	connect.connectStr.text = serverName;
	connect.connectButton.addEventListener(MouseEvent.CLICK, doConnect);
	enablePlayControls(false);
}

Richard

You don’t have to de-compile anything because there is source code in the example. The VideoRecording example has a clientFlex folder with a Flex (SDK 3.2) version, including source.

Richard

Yes you do.

Richard

How does choosing higher resolution over bandwidth affect the end users’ performance?

I would like to improve picture quality as well. I am currently using 3CCD SD digital cameras into a switcher, then into a capture card.

What file is this setting located in?

Thanks for the reply, Richard.

I am not familiar with what you mean by:

VideoRecording client

clientAS2 examples

or clientFlex version

Does the fact that I use FMLE to encode make a difference?

Is this only available in vers. 2.0?

Also, I am still using 1.7.xxx version - I do plan to upgrade, but need to find a time when I can shut off the server for a few days and not impact others.

Well, I was… sort of.

I did a search of the forum about “improving quality” and this was the only posting I could see in reference to improving quality, so I am trying to understand what the original poster was doing, so that I can also apply the same settings to my streams if possible.

Currently I am using JW Player version 4 - would I be able to apply these settings using that player? The forum posts show some issues with version 5, so I have not upgraded my player yet.

thanks in advance -

OK… well, at the risk of hijacking a thread - is there anything I can do to improve the quality of the video?

I did try H.264 once, and couldn’t see much of a difference, so maybe I didn’t change the settings correctly?

Format: H.264

Frame Rate: 29.97

Input Size: 320x240

Output Size: 480 x 290

Bitrate: I am usually averaging 300 kbps up using wireless, 500+ hardwired.

In the display page I use f4v instead of flv to call the stream up.

(I have noticed that when I set the bitrate higher than 500, I get a lot of phone calls about buffering?)

I use live-record in my Application settings, and I also record a file locally using SaveToFile.

Do you have any suggestions?