Can I read SMPTE timecodes in an HLS stream?

I don’t really know what I’m talking about, so I’m going to start from the beginning.

We stream live video through the workflow Camera -> Video Encoder -> satellite (as H.264 over MPEG-TS UDP) -> Wowza -> Client. Clients add comments that are linked to instants in the video (much as when you include a time like “1:23” in a comment on a YouTube video). We want the comments’ timestamps to be accurate enough that whatever event a comment describes is still on-screen, but we don’t need frame-level accuracy (and we couldn’t achieve it anyway, because the comments don’t preserve milliseconds).

The issue we face is that the satellite link introduces an unknown but large amount of latency: several seconds at best, and a delay of several minutes is not unheard of. This is too much for us.

We know that the video encoder writes SMPTE timecodes to the low-resolution stream, and we believe this time is accurate. The client (JWPlayer) can read #EXT-X-PROGRAM-DATE-TIME playlist tags, so we want to convert one to the other. I realize that the SMPTE timecode is attached to individual frames while #EXT-X-PROGRAM-DATE-TIME is attached to chunklists, so we don’t need to read every single timecode: the first timecode in a chunk is sufficient.
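
To make the idea concrete, here is a rough sketch of the conversion I have in mind: turning the first HH:MM:SS:FF timecode of a chunk into an EXT-X-PROGRAM-DATE-TIME value. It assumes an integer frame rate and a separately known broadcast date, since neither is carried in the timecode itself; the class and method names are my own, not any Wowza API.

```java
import java.time.LocalDate;
import java.time.LocalTime;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Hypothetical helper (not a Wowza API): convert an HH:MM:SS:FF SMPTE-style
// timecode into an EXT-X-PROGRAM-DATE-TIME tag, given the frame rate and the
// broadcast date.
public class TimecodeToPdt {

    public static String toPdtTag(String timecode, int fps, LocalDate date) {
        String[] p = timecode.split(":");
        int hours = Integer.parseInt(p[0]);
        int minutes = Integer.parseInt(p[1]);
        int seconds = Integer.parseInt(p[2]);
        int frames = Integer.parseInt(p[3]);
        // Frame number -> milliseconds within the second (integer frame rates only;
        // drop-frame rates like 29.97 need more care).
        int millis = frames * 1000 / fps;
        OffsetDateTime t = OffsetDateTime.of(date,
                LocalTime.of(hours, minutes, seconds, millis * 1_000_000), ZoneOffset.UTC);
        return "#EXT-X-PROGRAM-DATE-TIME:"
                + t.format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX"));
    }
}
```

For example, `toPdtTag("07:56:34:12", 25, LocalDate.of(2024, 1, 1))` yields `#EXT-X-PROGRAM-DATE-TIME:2024-01-01T07:56:34.480Z`.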

I’m running an old mp4 file as a live stream using the instructions in Publish a video file as a live stream with Wowza Streaming Engine, and I’m trying to read the timecodes with custom modules based on a few articles:

  • Control the display of program date and time headers in HLS chunklists for Wowza Streaming Engine live streams
  • Listen for stream events and codec information with the Wowza Streaming Engine Java API

(I’d link to those articles, but as a new forum user I’m not allowed to.)

These APIs capture a variety of timecodes, but none of the timecodes I’ve been able to read appear to be based on the SMPTE timecode.

I’ve also read HLS Live stream Timecode issue, but the suggestions there involve using the Wowza server’s time, which we know isn’t accurate for our purposes.

Does Wowza have an API that allows me to read the SMPTE timecodes? If not, is there any API that exposes the raw stream, so I can extract them myself?

Hi @Patrick_Conley, first of all, thank you so much for sharing the details of your workflow and what you’ve tried so we can assist you. I’ve checked in with an engineer who specializes in this area and will let you know what I find out. Be back soon!

The timing data may be available as picture timing SEI messages, in which case you may be able to use a module that implements the following interface to listen for these messages.

https://www.wowza.com/resources/serverapi/com/wowza/wms/stream/IMediaStreamH264SEINotify.html

The other place the data may be is in a separate PID in the MPEG-TS stream; in that case, you could listen for data on that PID. In both cases, the timing format should be HH:MM:SS:FF, so you could use that to create the ProgramDateTime based on encoder time instead of server time. You really just need a single timestamp that you can then use as an offset applied to the stream timecode when each HLS chunk is created.
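
A minimal sketch of that single-offset idea, assuming you have one encoder timecode sample converted to epoch milliseconds and the stream timecode (in ms) at which it was observed. The names are illustrative only, not a Wowza API:

```java
// Illustrative only: capture the encoder-vs-stream clock offset once, then
// apply it whenever a chunk's PROGRAM-DATE-TIME is needed.
public class EncoderClockOffset {

    private final long offsetMs;

    /**
     * @param encoderWallClockMs one SEI timecode sample, converted to epoch millis
     * @param streamTimecodeMs   the stream timecode (ms) at which that sample was seen
     */
    public EncoderClockOffset(long encoderWallClockMs, long streamTimecodeMs) {
        this.offsetMs = encoderWallClockMs - streamTimecodeMs;
    }

    /** Stream timecode of a chunk boundary -> encoder-accurate epoch millis. */
    public long toProgramDateTimeMs(long chunkStreamTimecodeMs) {
        return chunkStreamTimecodeMs + offsetMs;
    }
}
```

So if the encoder clock read 1,700,000,000,000 ms at stream time 5,000 ms, a chunk starting at stream time 15,000 ms would get a ProgramDateTime of 1,700,000,010,000 ms.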

https://www.wowza.com/docs/how-to-monitor-mpeg-ts-ingestion-to-process-additional-data-streams-scte-35-klv-etc

You are welcome to seek assistance in our Hire a Consultant forum or reach out to Wowza Professional Services for paid help.

The timing data is stored in picture timing SEI messages. I was originally misled by ffprobe, which shows the timestamps correctly with -show_frames but incorrectly describes them as SMPTE 12-1 timecodes. I was able to read the timestamps with IMediaStreamH264SEINotify using this module:

import java.util.Arrays;

import com.wowza.wms.amf.AMFPacket;
import com.wowza.wms.media.h264.H264SEIMessage;
import com.wowza.wms.media.h264.H264SEIMessages;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.IMediaStream;
import com.wowza.wms.stream.IMediaStreamH264SEINotify;

public class MediaStreamActionNotifyModule extends ModuleBase {

    private class SeiListener implements IMediaStreamH264SEINotify {

        @Override
        public void onVideoH264Packet(IMediaStream stream, AMFPacket packet, H264SEIMessages seiMessages) {
            // Log the raw payload of every picture timing SEI message in this packet
            for (H264SEIMessage message : seiMessages.getMessagesByPayloadType(H264SEIMessage.MESSAGETYPE_PICTTIMING)) {
                byte[] picTimingPayload = Arrays.copyOfRange(message.getPayloadBuffer(),
                        message.getPayloadOffset(), message.getPayloadOffset() + message.getPayloadLen());
                getLogger().info("picture timing message: " + Arrays.toString(picTimingPayload));
            }
        }
    }

    // Called by Wowza when a stream is created; attach the SEI listener here
    public void onStreamCreate(IMediaStream stream) {
        stream.addVideoH264SEIListener(new SeiListener());
    }
}

I eventually verified that picTimingPayload is a picture timing message matching the format in ISO/IEC 14496-10 by writing it out in binary and searching it manually for the bit sequences matching the time I got from ffprobe. Unfortunately, in order to parse it automatically, it looks as though I’ll need several flags (nal_hrd_parameters_present_flag, vcl_hrd_parameters_present_flag, cpb_removal_delay_length_minus1, dpb_output_delay_length_minus1, and time_offset_length) which aren’t part of this payload. The spec says they’re stored in the video usability information of the sequence parameter set, which is in a different access unit than the SEI messages.
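
For reference, here is a rough sketch of what parsing the clock timestamp out of that payload could look like under simplifying assumptions: CpbDpbDelaysPresentFlag = 0 (so the HRD delay fields are absent), pic_struct_present_flag = 1, full_timestamp_flag set, and time_offset_length = 0. It also assumes the payload has already had emulation-prevention bytes removed. All names are mine; a correct general parser still needs the VUI flags described above.

```java
// Sketch only: extract the first clock timestamp from a pic_timing SEI payload,
// per the D.2.3 syntax in ISO/IEC 14496-10, under the simplifying assumptions
// CpbDpbDelaysPresentFlag == 0, pic_struct_present_flag == 1,
// full_timestamp_flag == 1, time_offset_length == 0.
public class PicTimingParser {

    // Minimal MSB-first bit reader over the raw payload bytes.
    private static class BitReader {
        private final byte[] data;
        private int pos = 0;
        BitReader(byte[] data) { this.data = data; }
        int bits(int n) {
            int v = 0;
            for (int i = 0; i < n; i++) {
                v = (v << 1) | ((data[pos >> 3] >> (7 - (pos & 7))) & 1);
                pos++;
            }
            return v;
        }
    }

    /** Returns the first clock timestamp as "HH:MM:SS:FF", or null if absent. */
    public static String parse(byte[] payload) {
        BitReader r = new BitReader(payload);
        r.bits(4);                       // pic_struct (NumClockTS >= 1 assumed)
        if (r.bits(1) == 0) return null; // clock_timestamp_flag[0]
        r.bits(2);                       // ct_type
        r.bits(1);                       // nuit_field_based_flag
        r.bits(5);                       // counting_type
        int fullTimestampFlag = r.bits(1);
        r.bits(1);                       // discontinuity_flag
        r.bits(1);                       // cnt_dropped_flag
        int nFrames = r.bits(8);
        if (fullTimestampFlag == 0) return null; // simplified: require full timestamp
        int seconds = r.bits(6);
        int minutes = r.bits(6);
        int hours = r.bits(5);
        return String.format("%02d:%02d:%02d:%02d", hours, minutes, seconds, nFrames);
    }
}
```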

It looks like I can get the sequence parameter set from H264CodecConfigParts#getSps, but this class’s package isn’t documented on the website. How can I create an instance of this? Or does Wowza expose these flags elsewhere?