Hi,
I have video and audio available in two different locations, and I would like to mux the sources together to produce a single AV stream. Since the encoders aren't shared, the timecodes will be very different.
I intend to use FFmpeg to inject timecodes into each stream at the source (before it gets to Wowza) so that I can then tie the sources together as closely as possible. I can also make adjustments in the FFmpeg commands to allow for any delays that I need to introduce; a rough sketch of what I mean is below.
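For reference, here is roughly what I had in mind (the input files, server URL, stream names, offset and timecode values are all just placeholders, and I'm assuming an RTMP push into Wowza):

    # Video-only encoder: shift input timestamps by 0.25 s and stamp a starting timecode
    ffmpeg -re -itsoffset 0.25 -i video_source.ts \
        -an -c:v libx264 -timecode 10:00:00:00 \
        -f flv rtmp://wowza.example.com/live/videoOnly

    # Audio-only encoder at the other location, pushed to the same application
    ffmpeg -re -i audio_source.wav \
        -vn -c:a aac \
        -f flv rtmp://wowza.example.com/live/audioOnly

I'm not sure whether the -timecode value even survives the FLV/RTMP leg, or whether Wowza only ever looks at the RTMP timestamps, which is really what I'm asking below.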
My question is: what “timecode” does Wowza actually listen to? There are lots of different ways of embedding one…
Thanks,
Joe