Hello,
We are getting a stream from an IP camera with RTP over TCP (interleaved), then we restream it to Flash over RTMP. We are seeing a strange problem even on local networks, where bandwidth is not an issue. Streams without audio work perfectly, but streams with audio (G.711 ALAW) play for about 20 seconds, then freeze for 2-3 seconds, resume and play for a shorter time, and so on, until at some point they freeze altogether. Throughout this process the "buffer length" reported in Flash keeps growing. Fiddling with the buffer values in VHost.xml changes the intervals but never fixes the problem.
One thing we notice is that audio starts immediately, even before the live buffer is filled, while video only starts after a few seconds. It looks like a sync issue, but none of the methods suggested on the Wowza forums (sortBuffer, jitterBuffer, etc.) help.
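For reference, this is the kind of thing we tried in Application.xml for packet sorting; the property names and their container are copied from the forum posts we followed, so they may not be in exactly the right place:

<!-- Application.xml excerpt: RTP packet-sorting properties we experimented with.
     Names taken from forum posts; values are just the ones we tested. -->
<RTP>
    <Properties>
        <Property>
            <Name>sortPackets</Name>
            <Value>true</Value>
            <Type>Boolean</Type>
        </Property>
        <Property>
            <Name>sortBufferSize</Name>
            <Value>500</Value>
            <Type>Integer</Type>
        </Property>
    </Properties>
</RTP>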
Our two questions are:
1. Which buffers are relevant in VHost.xml for the incoming stream from the camera? Should we use RTP/DatagramConfiguration/Incoming, HostPort/SocketConfiguration/ReceiveBufferSize, or does interleaved RTP have its own configuration? (The snippet after this list shows the two places we are looking at.)
2. Does "recordWaitForVideoKeyFrame" work on live streams, and if not, is there something similar that can filter out audio before the first video key frame?
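For context, these are the two VHost.xml settings from question 1; the values below are only the examples we have been testing with, not recommendations:

<!-- VHost.xml excerpt: the two candidate buffer settings (values are examples only) -->
<HostPort>
    ...
    <SocketConfiguration>
        <ReceiveBufferSize>65000</ReceiveBufferSize>
        <SendBufferSize>65000</SendBufferSize>
    </SocketConfiguration>
</HostPort>

<RTP>
    <DatagramConfiguration>
        <Incoming>
            <ReceiveBufferSize>65000</ReceiveBufferSize>
            <SendBufferSize>65000</SendBufferSize>
        </Incoming>
    </DatagramConfiguration>
</RTP>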
Thanks,
Özgür