I’d like to be able to sync events to a live stream in Safari on the iPad. For example, I’d like to display real-time closed captions that are being piped into my Wowza app. For Flash clients this is easy, but I’m not finding a clear way to do it for the iPad.
From what I’ve found so far, there is no JavaScript analog to the native app timedMetaData event, so it seems I can’t do it with ID3 tags. My thinking, then, is that I’ll need to go the route of an HTTPProvider that pulls the caption data out of the app instance, and then long-poll the HTTPProvider on an interval from the client side. Roughly what I have in mind is sketched below.
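This is only a rough sketch built on Wowza’s stock HTTProvider2Base plumbing; the “since” parameter and the caption lookup are placeholders for whatever the app module actually exposes, not working code:

```java
import java.io.OutputStream;
import java.util.List;
import java.util.Map;

import com.wowza.wms.http.HTTProvider2Base;
import com.wowza.wms.http.IHTTPRequest;
import com.wowza.wms.http.IHTTPResponse;
import com.wowza.wms.vhost.IVHost;

// Illustrative sketch only: a long-pollable HTTP provider that returns caption
// events newer than the id/timestamp the client last saw.
public class HTTPCaptionProvider extends HTTProvider2Base
{
	public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp)
	{
		if (!doHTTPAuthentication(vhost, req, resp))
			return;

		// The client passes the last event id/timestamp it has already rendered.
		Map<String, List<String>> params = req.getParameterMap();
		long lastSeenId = params.containsKey("since")
				? Long.parseLong(params.get("since").get(0)) : 0L;

		// Placeholder: ask the live application instance (or a shared store the
		// app module registered at startup) for events newer than lastSeenId,
		// then serialize them. Hard-coded here to keep the sketch short.
		String json = "{\"since\":" + lastSeenId + ",\"events\":[]}";

		try
		{
			resp.setHeader("Content-Type", "application/json");
			OutputStream out = resp.getOutputStream();
			out.write(json.getBytes("UTF-8"));
		}
		catch (Exception e)
		{
			// ignore for the purposes of this sketch
		}
	}
}
```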
The trouble I run into there is the varied delay any given HLS client might be experiencing. Is there a way for me to gauge what the delay is for a given client? Even better, is there a JavaScript event in Safari that I haven’t found?
In the end, I can tolerate a few seconds of skew in the sync (I could live with 5-10 seconds)… I just need to find a way of managing it consistently.
It is possible with ID3 tags; however, there are two problems: we don’t have a working server-side example, and we don’t have a client-side example of how to handle it. I’m not sure about your workaround. You could easily fetch data about a stream with an HTTPProvider, but I’m not sure how you would do cue points that way.
I have looked around at how to handle data from ID3 tags on iDevices, and as far as I can tell Objective-C is required (a JavaScript option would be nice), but I could not find a working example. So this is a non-starter at present. I know that is frustrating (I am frustrated too), but I think the ball is in Apple’s court at the moment to demonstrate how this should be done on their devices. Then it will make sense for us to move forward with Cupertino cue point integration and examples. Please let us know if you learn anything useful.
On the HTTPProvider side, I already have a working model whereby the HTTPProvider invokes methods in my application module. The module keeps a hashmap of event-info objects, each with a timestamp or id. The HTTPProvider calls a method on the app module with a starting id or timestamp as an argument; that method iterates backwards through the hashmap to find events newer than the id or timestamp passed in, and returns a new hashmap with the needed data. That part works just fine, so I can rely on the HTTPProvider to get messages to iOS clients if need be. The module side boils down to something like the sketch below.
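The names here (CaptionEventStore, getEventsSince) are just illustrative, not Wowza API; it’s only meant to show the shape of the thing:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the event store my app module keeps.
public class CaptionEventStore
{
	public static class CaptionEvent
	{
		public final long id;        // monotonically increasing id (or a timestamp)
		public final String text;    // the caption text for this event

		public CaptionEvent(long id, String text)
		{
			this.id = id;
			this.text = text;
		}
	}

	// Keyed by event id so lookups from the HTTP provider are cheap.
	private final Map<Long, CaptionEvent> events = new ConcurrentHashMap<Long, CaptionEvent>();

	// Called by the app module whenever a new caption arrives.
	public void addEvent(CaptionEvent event)
	{
		events.put(Long.valueOf(event.id), event);
	}

	// Called (indirectly) by the HTTP provider: return everything newer than the
	// id/timestamp the client passed in.
	public Map<Long, CaptionEvent> getEventsSince(long lastSeenId)
	{
		Map<Long, CaptionEvent> newer = new HashMap<Long, CaptionEvent>();
		for (Map.Entry<Long, CaptionEvent> entry : events.entrySet())
		{
			if (entry.getKey().longValue() > lastSeenId)
				newer.put(entry.getKey(), entry.getValue());
		}
		return newer;
	}
}
```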
That said, using an HTTPProvider means the client only gets new messages on the long-poll interval, and it gives me no way to compensate for the HLS delay when trying to sync events to the live stream.
I also have a module that adds ID3V2FrameTextInformation objects to the id3FramesHeader as needed (which seems to be working), but what I can’t find is a way to get those ID3 tags out of QuickTime via a JavaScript event in Safari. From what I’ve read in the Apple docs, you can do this in a native app by listening for a timedMetaData event, but in the case of Safari using an HTML5 video element there doesn’t appear to be any equivalent event exposed to JavaScript.
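For reference, the tagging side of my module is roughly the following (heavily trimmed; the packetizer-listener wiring, the import paths, and the getID3FramesHeader call reflect my reading of the Wowza Cupertino packetizer API, so treat it as a sketch rather than gospel; in the real module the value comes from the caption feed):

```java
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.httpstreamer.cupertinostreaming.livestreampacketizer.LiveStreamPacketizerCupertino;
import com.wowza.wms.media.mp3.model.idtags.ID3Frames;
import com.wowza.wms.media.mp3.model.idtags.ID3V2FrameBase;
import com.wowza.wms.media.mp3.model.idtags.ID3V2FrameTextInformation;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.livepacketizer.ILiveStreamPacketizer;
import com.wowza.wms.stream.livepacketizer.ILiveStreamPacketizerActionNotify;

// Sketch: attach a listener to each Cupertino (HLS) packetizer and push a text
// frame into the ID3 header of the chunks.
public class ModuleCaptionID3Tags extends ModuleBase
{
	class PacketizerListener implements ILiveStreamPacketizerActionNotify
	{
		public void onLiveStreamPacketizerCreate(ILiveStreamPacketizer packetizer, String streamName)
		{
			if (!(packetizer instanceof LiveStreamPacketizerCupertino))
				return;

			LiveStreamPacketizerCupertino cupertino = (LiveStreamPacketizerCupertino)packetizer;
			ID3Frames id3Header = cupertino.getID3FramesHeader();

			ID3V2FrameTextInformation caption = new ID3V2FrameTextInformation(ID3V2FrameBase.TAG_TIT2);
			caption.setValue("caption text goes here");
			id3Header.putFrame(caption);
		}

		public void onLiveStreamPacketizerDestroy(ILiveStreamPacketizer packetizer)
		{
		}

		public void onLiveStreamPacketizerInit(ILiveStreamPacketizer packetizer, String streamName)
		{
		}
	}

	public void onAppStart(IApplicationInstance appInstance)
	{
		appInstance.addLiveStreamPacketizerListener(new PacketizerListener());
	}
}
```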
Yeah… support for anything other than “hey look at this video” in iOS is woefully lacking. Even being able to calculate the delay between what the client is seeing and the video arriving at the server would have sufficed, but alas, no love there. Perhaps Apple will get past 2001 technology before too long.